[Binary artifact, not text: a POSIX (ustar) tar archive of Zuul CI job output. Members: directories `var/home/core/zuul-output/` and `var/home/core/zuul-output/logs/`, plus the gzip-compressed file `var/home/core/zuul-output/logs/kubelet.log.gz`. The compressed log contents are binary data and cannot be recovered as readable text from this dump.]
[/8XT䫉91oqisro|Ee˵1>oO|Tࣅ7r5 "HL&BS"wέcI1pq`DS r𘑱Y)<0:[\&UNקᐦ2{UCZ"N 5H&8pm8~<+2)mׁOd MD޹&d 1J@ >B:>O=媌= 4F=XEP_`=Ug(4{ Zb/{jޗOqлq}AbKzInwك{g`eceWݫ(?f峎,|:~ b3]$]()ѕVJu$JeVˊR@rυ\s1^nqyb0|퉞M))5I Yƕur,~@bDQ$D)Ot eEZdXVX)Ra-t`Oy/vɳqIa/p< zB:J1C1 \ņ M-\ NON;/ Q$Ίb0«mmKi׃i/մ}99,Sff'_k5?>;}˯&4O\;,sMhβYqVDylV7IM43$f|(>ɜ'a6LZ%1eAsD >P *'HM )!I۲5G#eXnI%"E-hW!7qFy3|2!j֝.w\h3hX,)UrUQ֢V]1/J!$ZBd 6X[J؄×ŕWt툮ywylђ }}tbO4~_o k%dFslT^3A%$$c"^6`)|vzW$iA 6:S H= "ED_s[zb]H"<*kK тXX+˜є jIRQ*Hg(ns<ؚ%抴C"vP,G~ gVxwӞ{M8Vs]d.g\(]oQO ga-7\~C>^َzp*GX1rp">WFq?q9VJ;OJ)-⢕LO8k8l7/.'[ j|ݚWsDsmM"^[n{2ɳiy7ݪk'iv[ޛKkh0}PӫiH {zfw#Q/BpF{oûLԛQ=U Nt.+_mGp%&$dF73HJE#BM-ߏZPA}Zx%**88TQdJKCS7df#GYHc@盠ggRXfqT9* Y@/}_L}p#ྙؐex?hìq^]$h/&` 'I'p2jJmUZz0FJZɦ$-.| E}<̚![TcR XA.i9/_w?1]a\ D8JЍI4m!6nvC06v0I){,hQIE(iefP<ȼ-V)(%SBe80(3Y`jm™8;9 ~;{4wAft} k_R0\5C]N}hɘ9v˗rtdrh 0։F fERpk#֒D9NXPvh(Xj#B$K3*Ř[UtblGJ pu%Hg_r¦1d}\Q##ETo;̵ў/Gڐ \6k~&h7ZsNjqP s)dui[sr4h*[%bыM]J4}O7pux,P'$4lEy:-w>kyd1tuwJ;Zy yX9\R`wz)9[VqSqs9Xp|2@=krj^܏O;{^ۭ[ -6]j}$DT\.[]Rd]W P#]^tLkTzWUdWM'Ҵmm 1 Zzhꎻlod)Ku;k4$fdzK.m]@935;okooZlByx,O^KsɗgN!I\3Ǐ9.$`5Xlzuo0_ ;_ sȍl fԦi\҇rJE9 7=sHY˵pAb"&Θ og3_o m++IĎTO'DÏw)wi=n߅I+q.@e)of6 |RyG0AQ[ND7M0ژ&ů|~ƇG{k彧t]|nw]ڥWʒ.,!RS&Z7(#^E&f;ed\>Q.I0PNvK3B;6{&P!?GкlbC3q"<"N1y';iן8dJOa*=B SJEtj9gpQXcI1-SJ:q4u"Q/(A&EA%jѩ/SDZ)}VHI9E !1RV*N pF'7ns_J\7SeԶq3]#Ko=7Nqv]zF2QR*iRM`@d+ Exh& [HҒp<M})E4S-ȸl+kP`HA2D!#!g4qbRjZ_'{ڝ x (%x rf]rt)MĨ0 z\gL]])c8֕#}X[aUqщ dU6aMMMۋogi Ps,v2)+/EMu@9 <S+ʭlw"'R*\vA HixD{* `pu1I=.hb *<TD6)Nm:w9YYgeۖҮozXMs8wsJ)B^Aq;|[~}u%Eyja2cQa)yNvb1J&s18ᆡ]{o9*BY,m0rv.8l`ik,KZN,_%K[%ʒ< M⯊*佖FS):WԕnS|}P "H-Jƒ/!ě`#Yia9N?` ,ȍ$SCՉ F13)QQ-㴮AEd #gWd|WmD԰hI܁''U֞X9 bYSr;)k?wSShmdՌB}Lpb'8>jK ^Ȍ ^`A ƜJ*fFj͸EfMu!tQE&ɋrYX\Yޕn2('nP`u/'@cglQ420fD%$FtY$JYcI*MrX^&!a iU30`&A0 ('%c}94&=r_v2+10BB2ց3uS/IN $5)S吱 N:I$mvnI0qWNuJp Ro5U/kDEt0{ !eM8 ^牶T[hU0ƅVZxZŽtq/]F[˔$X<"*ᄉ!FbTt6xm֙ud; 39 98ـx3f]u u>T~ORʑ U·$ *O bx1.uuQ^=L\LoKgZr@Yeƫ;o\07<3km-g ףaQcڱS/ɜ2cOnM% qjAOD.GR 2Q+dRp|X kW/mGcl *X0B`D$4q-MVx:e2I]HS1{rǰT:qF4Q 0Fs;,5B )@kB)P 
)$q1{:,U3$|HaDBPԋ aB@r:,DDꥦ0++R Όf(W5Sf-c7]U s;̡×}_ŗGL©c N)81**4WJ j#[I-wh5Z`(3*92:ڀmT(P/XJBԀPErFj ~g9NWl#io+:rO%{$v&7`TR+"*(` m|,Dc3*_G'i&2]qe]Wkt՚_l;'%|gH  e5Iҥke,:uqEc`)[0sH[]DӹsbeNE.pycZ&ĵ r ψT~,&՞2#,RR:AJ;k=Waxb;;4՟'Bf_,?xL<[cޞFDPoe8wnT#rTGCH[3AGuŃ3{; p/]](KnoH z&YNƣOոA1]\$m`纒Gӗ@`O.Yo{ejAK7a9,f}u) vy8rɴr{sկh-ayb <!`v?|`Yw"W{N(^X(CL<Ņ]r,;i)]EԳܰoW8͌>V:.tqpIiCr\, >eI}q 3Z@y%S^f__vu;~),=hj&4[rztΠ_OxSHLYI;-sf i杼PinND@'}ͧ(E._x99P Eݎ7$o ?'wr:a!Z+]-O{Ǒ]~"GOy>G.Zԩǎ:B4)r`BÃ!(C)ke\5Ƈn#vڌQ:VeGasdlf蝦(q:YE^;oBHccJXyM˛ (W%6!`BL1jve@Kxge#ghPFn[+`i/GcVlz{oOZ^d:κ\' Y&LX2c&@(KUL Nh,L 8iC H:\(RMY^")9IE$s2JaQiƝVc  B' L(=W+ƽPYe7a%ֵJ(8B["'/G}y%#1qޫFDS,"t˻떦Z$& DHW:#X$Pi=8Nb> R>AjKر&;Q3DP *" Jj*S['yΥ3ZΟ ,0EeJ^īqgI)$POOaOٷV E_e:Y*{w9{fM^^J BOPU4?go_{(bz *S5>v1!L9A'RP;1e> ]4+aUiC(XMX.A4Uh!^\wOvxq8( '|ŧ}d|=3ҝ^ލHբzmo5UZtr}s: TK$::qk!?~4u,գ[X'.a{0LM^<U[}Cu䲹~=&31[zn9'Qo8|;RM 2Fbu$Ɵm>ƭ3GZJ? YLh8Nty?f?Yp |CX~2m,9_Gc-Co{Im;Q,TO?o\O|ÇwuߟV? LK9Hxk~ۊƶ {Cskho:4Ul̈́19eGZ-Wl͖J(ۻ_~~$ U,~\I]}Od_neoͤꯏdF QSڈ)gzTP@o vZRaM/{FzjO`ed@&XE"SCր0`C@vFҋ{:! 
RgiŵA9<ф:C5Ly(\)2ݦNMoks xbрv3`5Ppeu]IQRF6J!^ؠLa  ЫKMrؑb6h[?Yb"^o>Aĥ G oμ| == (% 4ndO?0?7—DlU77_S~0}s#XW8lEFן~؎Z}NOMmG%Qp=L@.*@Yg XY-3ÃoS7g5zt\H K rý$ipIM_pJ74zԗotzug͝_*T/jzWLnòVg0@/?&Ʉ^׃g}NOzxFx"_ooWO/i~AګU,h(!ҙf _)/d^nA/o4c/PahmZQتg$''Tk lV X[.!aΐh\uD%xgf1ZAC V#?K|eܗ6ד'MXB"ޓE̔`C%op0F=fP%!8Nf@;, (Zý`cMZ9Ait#5r֍n^?Bt3Kq=9_O AHRVa `nhX$"﵌F5ͭlU Q';#&ohbg#+* ?G'-G7d ݺ IMҩg,@(vGEgO>(0 r# T));-9E4\ӺjwEZ[$@6>$T~GZʑU$ * q1a3ܩp|j]ԭw9,7t:k]%ռW~wny|xs-zkƣњw6<+l:б7ᲮTҬפ8tӦۦRn/9/oq{*"}sfg@wX}Ac>ZXgSN0]ϊgř|WWy7h hR,Q"ƨ"[YC4Q"Ъ`X̛]Z鄒1 &*P j4!13ˌ3q&w\l :J ,:ch=T[6,rËc%*"=(1wx6pcr8?Kvu\wPs~PJ͙хkMi4bArjq3q6 :PHC K)FD}3U%+жlK sŝ^P%)N A41*4|GFml`1o1,Y@g2D'Hk`|K$%\2NXYh6.w1ll,8V ?lMˑM1M+剨Ͱ3txlxƗ}$˴+AEy:AbR{`ieVPhbZv›Exdw⎛T- M6lvJJ/Og=7*AƝh$bV]xzs;~ [UESJpZ 5TYutB[HR\G8A"fF(H*ˆ)c@"ˆ"tY+냉KM-a$EV\W @8?f#gBck\N‘ r_7 ]Mho XBiF/n3}::urm%:AK^Ĩ_VQRJP1EJ֊ӘG"h@Qi#( F$;A Ut(wS6r6KJ 8m1@{/V62۬)T7~EYɳL¤5ɑZڂ.Hw&zGS:"V2[/7 ).>ic|z•2_XҼ־.F t;t?FW?_¿Jt}ouuU$w< ¸h`q*+C1zy>9NkB$a0_}pўG/>߾$&oW>uGۇK?0}dT_zF.!ϸ?iR[jiTrv yE@(Ku+Qټrf8UI[pYp!ᎏDՂK8ME޶p"ϖSTXJ5n^'KշVWw~*3`w{u]z<,kcr}v"]>u ػ]g&3^)17-dY)/~9z|4݇@y >O!]a= t.5 -c13!2?X]b҅VTX9k>cz7nq3_v?𾳘G_VEW Ϡoן)ݸ5[7.UDgôE;2N]Mwo6ok9]M'7)7;;IeLOηȨ=:Kݵ$тضSʄ6<ϑN.C.+I  gIh\Y:xfd//z0;ɔn"A&57lBô 2͡L7VW(G"L:-wHZni t䶉\i:Z,]DBy[N¥o~װiJ_0%pIh+55dޡٰbtk>$;& ôD56,yCgo'M)*DU<횁*DDtҡǢ|yuzóMn兡{;ʹqao ]yk PJx'H*Hp2!C%V@(gιt].o<]oʠfG֚uQk))2¦UvZR.~Oj( ,s2J &9Qix`,(U0xa23Ńt]A4d1ScԜ'XnAꀔ Zr*a3Y=E ΒBz%Z6J 9K]kX4ucl^;@BˍƱCG{BG{tԿ$) 3u <$(R4G5(§YUMx $ @ZgYB<` eRgAcVRqbѰ(p (1B)7j<ۤudXDtaREb"D4FfH)8~$jډ7;(Pj#A=AEʿ14eĜTa?:wSAvIF}vSzvP(v'_Œ/2]vyl+p48F9*,yCU:FE^F-zBQ0!S&aJ;f2  "0gl6r<4>;N]VV7=K\ tnmzyXkS_uMU*L Yzm#Wd MS f` nU@As 7.B]RS8;ey<@4:&Ξ WyJb,A"A<>fW {~oJ-[vsDOJFb(=WR2y=l 7_]g4m#H +LJFu'FX$Pi=8Nb R>AzLlB.j8 6y$rfP1LTRsP:ct.5^F8:n]>EeHW+_Nz ޿w6Ozz |g*[TR*NR4I>Óf?+ozEJp^}-9iLV)a::RLݑ)piGQ袹7 WE-6` 7aT\x3r-[ߝ6Oacy?g`Vtwe*%R_ON'W BJISV#U+r#sft6qr vG:[SӺW7O[gUK}cu伹~'3|v^-wk;I?^Odf;÷ i#] CڇѸuyfBpT'XɊBwc/5#u~$Fm+VLa*#i`?K]_76mySEu*'Qwַ{u'PG߽?>bNO߁$L/O"'po追q547*Ђu[+y){O|@=ʙɭϽҥ10Oztg5#'+@6e]ٻ6$ 
%یuuk_': THʱ~=CH"%g]ƕkT}s*8KfT3E"'e&G*s|¯M}B[wG׵'$ʩ.ced~M\GKnH-|>ZGdJ*1!8y{ Tc~uir2IWŪY&l8v>GU*W( #MC.ZjjF Jb$xp u&l75/שuJ[%}3BxgN>8nj# $UM)hɒ-QmGxfg4H]eC+u\k>~Ui}]θv0]ӪIȄL.Fq#|ջoThҜbwoߛ+؍b4#|4Kx I%Ya 38znsC֔TBM0_؃/{| =^jgܞnx/i9I!ӂGQ;@x sZD) Kj.:.QőM:Z6^#*BE*$O5Yjq4bL0HqL01@QX*θB,;G EέRȰ%cM4Eb Dcחݝ#X !!96gc#[x)`Εq"ree{e[Er7Xb%};Ѥa`"kma+0a^(E&ZF!`P$&v 53cF$lB.j g>H8hAE0ARIAdj$ѹZRI *(us\fa% U{;=D"cW"vR0$M89!V`k ha9:ENEx&G+s|¯}JB\ﴏk ;5Z5(փ V&J@Hxj(u:b4t0`HzpoTՙ}Gڞnksyh‚FXrZ54:Xۋ9f46ʇ\ -hMƣBN9ZVh"vK;Ji|,Z>(EMcvQ*ON曄O` HVE8`^N ZaFj*e;3m8v0)&8 ؠ팜ut>@=*D"F 3qkiP,jPG>"M.qz 8. %U4.m༊G+o"XV!ܘJ~]D "P8\ HDLC-DJ#XrI=7_l,жf7Aq:CɅt {'S#KS)쑨x {$**%>#jlҮtbR5MA bR1n#\qȺ(hlb ,BcԜokq)ꀔ)Z"0- UZکsg@ LjCV lxbK/>pLԝp 7&xҌHWŘ{$cNiɻuvчG?ͩluh';Њv_SxE? ^ќ)E+>heuV(]Uo0uaea6m5QA@Yjq~Ѧf/o~^EvlC-v4xў<8 x$ uނn3jX$"﵌F PDm5ɝL@F'BXdn5+jbg#xE%PnXq97 9BQnCRT `]QQ`h@YI(}Jڀ튜,NGnIG(&^_KZVgdôy嫋D}dtӧ\6j {r=MV\tk/nl󧂷ЩT^#!ȣF6šFp2"s]I(aS0.NXJP2z-!|J rK%bdF/cL!յd쌜5byREv36'% gM.h1IWG/Ǯu.S3؝V)Y¯x$ѨN0+D{CJG40H\T]d7-O+r9D]pXBgO94H`&$h !C=r`AiLv}Yx310BB2Pg(d=`$&F:łK&9bǖ\N+V8m9lL1 \BpEZ{IUҰ)ìQa)FTgs19FOvfPa9ޅfnik`i Tޮ?u{oh} KjQM)53ZXdez@iԟ&n:-{tLs=nJZ14%:!Z3|j1yy#ޚnAh8ޑB`7@ t&kz\-!k/RٝU\.*T)߬P4@SBmd)W]%)i={HiUuzU)KiĻfWO[GĽڲ:8G;G.(0q6WZѮ8{6e7gn-]̯Xf]pnFt,gRb\plP1ـ eFL6zƜvP# x^g pw]L3{pz\_>)`0v d&a0 8kM!bv1eu:yFlMP]\|*X~R]ò~)Mͫ?_~<Ǩʋx87V#n=(QR~E~((P,Ʊq6ԙ=v[~|]|޼:o\)Iz_F4T]^߻E Ie .WփZ֫<>_uՉGqlJAZ^o;cnB롅Lzw5͘7YaRyk:Ρ|89fOg+`ܯn W7m3Arad'}[~nʢ#&ŽAo:/"MJ\"דeX|%prSQL;TKuR4#?ruopذ3WF 7 Z]% Bd,U0T J)iYBmZcICL]!*>}sVVm4:PO SN%bCwB&P尊KB-vm͸_Q[sU,[IVIj_hv+EȕT;s*]`IUXE21e/ԈE`SKmB&HAi Ngj H\vdAUrZ)jW])1Bo+x6-A Q@`4AbdF_*jsNS9ƉBY.[&P LD|8\6P˚SEKh 5LbLDhJTwN="!lު=N f4! |~W}xpp2ɣ=ٳK|&Oa_OpzA-]Mg ᖻK".645MAMA4)aN<3vrt٘DRv  C1ɟNٖz|K:zɎTAUiȆ(h>6t ƢS rAEdB"JjLz_5rUibyRA:_VWYBPrZRjCMPBDh \Iՠ1YWM<;-wr~5H%HNǗC㙘 N.G8M_ʹԫI\\PQkcYI{ bP+{ەwAw%fZRdMX7k;oyn0IBz'w43R>ǖfxɸs5vdžٓ'}?y=`G|K2h th uѕ ,n,jU9Vb[cȉT]*JzҊ+**hU.'p!T1M&Ά)g1A߿R@ltzvx9afņmݬs }|{}Cb1aS 1s13*9osL?iUeR NM.g\|C!\k . 
rZLb T5eѓ^^7┺\կ6bFbkr!RF1J"`V'``(-~b^7Q\]a) O ?|:-?C3/ p*FGIu'bhX~>4o;ۄ+^#&0VNש}izX9"%w+M&[F?p.r<&(-,ڃ3P{%u0nN}}; iIO{PD[dgJk [Ou`srIlP %J)buAU!?^jv)x?8m0XjL%Zw'ӯyNdPWfˡ:YeJ 30 6'=.H#(EH~7~%(񀊎7V!DFh4dlPgMeM^1ҢyguO-:9ZK8}+ÞZe!Fpb8!霪T$RdF^vXRص\,&&r%TbzJr*]:ؒѡ2N&z˸Ky[8x-4-S[2{s$X_z:yyӱη5]wͻ+rh'}/]^߻E IxKi=Hq0k{5^Yyr6 -P6v~Ms7yhΗwj\՛1o7BtwDGuGfu|=:rQz2C$'P~8ŵw5Ħ "00[&wuz\|VW[ vUY)@ůhZDں>ib&fX}EiA$J"KֶTȘx֒2VRl "ڈFgښȍ_[*#~qU|M6Uٗ 5Eʤd[[~!EIЂ,8 ,9AD3q0\~qG6i8r1Z-bwaE YvޯSD9n  HCԂy9i ~]ƕT^{y<6nͲo]iT޷-_e4h9V{r긲>8s|u Ü6o5kZH v6jG UTe$EK(2{yHV VL@B)cW7]:"ZIҚ:oib "D锨S(Xt0k^Ws81},;`Zϫa[v;ԓb:@g'sTJ Ky9xgV3Fg|^--E|u`$**r`\][c8'=:D?B_eC_mfuʅ>M..~(4V'/BM>%N`0? ziR]A`qG>,h[tx}(:++VQ#+Žcēpcjһ^]l;y+&ԳԮa HD]qYBOMvi{/+~8\G?_yi=3fY`>T)m8E{!)w ^4H?]5Dw?hryn\F/7ҔTr^"{0A MR6%f<JXCyjjwKKx. b[M]18p^32D1 :clrUY_+,W&S(w vMi-Wmz`hy-Z滬rBkF"0x%6I^YJ@XF_8X`!$W:~P !#4db)%O^hô:iCX!5R*BT\f Bm7{!sP9f쵕u^Ri LyhHW0:1$Z,*1m?5kkE 5a2O$ڨ8)ADG'7"/oENqs"f,x T/IԸ@T4^"ڌ&AF[KeJݜLʝX yή~7{wgoP|8_[V l_eLpy7퟿.8?+:?#9kͺn0q">5|4c #vvM.rfx .Wgqd_HtgUDpHibz;Td ,>r9q!V{?ǻD^޽)|T1n5F3/ߝJav͜IˮM['.qO0* qOMxv1⯵_'?\6|XfOz)̕>r>ޏ튓h0)>[("ikI֖\U[3K[Y,҆8`żOƣ@^ o6tUF6d[m}8[僅Бܰogq0Il3:O;ϓJ T9ye/q: O?}?埿?2}qM#MN_,¿^$/IMKUޢinM+eߡ]-rKOW\CrkÙ1w'8h38 +(6՜i8Z= T*KaAl*!um>IsNlZhyCY#KThԸAoY``X86QX8Z#!P{k^U}_#v{ 4-7 wČP@0`5Zz6(b^dTe–CG Rn6)f\vVo#n mcsL÷\o^v+7b̽1G9'D oU9-VUBXEM^8=$ȐgC"`ܪ-BʼIe.<2CB"IшӻSH1ΥܺHRl*aBHψo@ƿL Sׄ=^z.mxï{ GWBA7}^ (\uOɏVdY\COŏ~l~~l!"ZXD ha-,%쬰b"ZXD/,E"ZXD ha-,E"ZXD ha-,E"ZXDK-k*,E"ZXD 7ba-,ES ha8,,EѢkEwJBW8h -_+|ZtℸiP`pdqa"n,}R*Si 7 C$Da-\a-,E"&,5 W_ݎJ^`;9;L%(PUAUs\.ysEsEG/ad R) kS^L}4r% "Jmda21C jc h,"RHV'S.Y JS#1:b6 =]}% a<`8vuvW!&[[_;Tfov7WUvk0F]gF6f޽M]'Z6vrhPe \g3;C1OCt,]at5/zlK2[ y4 d%f^}R}7~j^)y>L' -nB.s~2N7;_RqpjN4zCrj.(k`jei~.X}1h Fnכ![髖[_6WXyB(pêuY\)O5K;')jrOQVvC9J~:*;)Z=U|މʘ+%JFoRST=)U|>E)MxWw[m˗C"j QEps o3@#@hU BNLm1e%#1(:)v&quOD?qɨJqGe8K"eӠV5-Njq򬝲 h ΙsZaWGRxHQhY4Em %#εJѫO͉od`^H3H# k,Ht6dA;OHx5 T 笒1̚gDHqV}.(_)rʷl"?@=mTZ;U"cb,t."rG!d"A7+0jvT!j{L~N[o¢!lȪRl$Jɲ \β};3G|+ bRAua ` eDGH!v΀ ~{hӧۭ 
lm_jvi4~?{(sKZhpr9UWk~mnzz4/U6 لatLB4<и4D? OMyf軛|۫FFUdi6GOv3?ˑG%OE|-? Vj v_57`x+mRJK!WjMu\%Ast.Z'k;yGɽ ރ|?3ןLf(t;=?1Kj־}}k5۫i . w-mEZ\=+댫fGy/p/ 8sˠBN;a?yɂkbhqmcl,=e֖eA!G d w1cY ȘTS g 4YSo EJdfִ+.ymo@EkS2ʱ3X8G=~8cRCpR<Ⴉ:"NbN1Vp!FCܼҢu[EYE ~*O 认G .GKÖG6&+I΋ kEk4[GBL.v6㈻<-4z MdWܟeOW, '| 퍦PVD+)+ {IL0RR8FI) ;[L^I Io %yBqks@)dNqM>4V'G46m7 ZEޒ 1DfU6 tRŐI(n`sH(A QP:GV !SU˨1" xr6f K١>E&E͵TmXMq7J-63lU'<|*Y-=܀be{:?wo;eH̄9!L,B9҈c) @VfE> &lJmVU4E)c1L9SKǣ\jW}ЬvC5LUm|@ \QJ !3 Z$a&LGx" &}ճdU5T1mB .ke4JH1Xtd^(#}T ۲|4iZӴsjUk⦦WseYiJ*V2wE;`jw]KNֺ־G\ć{hyX'L;˴TC)|9~Kb+Zj#_) Y^(ֈn90$e9_W}U헿D`-Q (P ИDZ: I:o󊓧2RGSHIɂB ;p_F!W0Ruy?|ezPMO~ǯj vhVbyԭ-*/St3}W+$|̀QƎJ\Zz:-'q<'}IS4d˟> 8׍if/-N'O6yFLˏ~u2д J%xOUWY/w/Ma9nw,}u>/= בGYeMoRoV ))ȦwЧ%4<%nؤ||SNfE#vfQTE"[\m[^^w4)fS)Fd4t^b5xcj:l.?uL[ޫ)tPra !+y6bSt>fd6Rҽu#ΓH*&iεODo@LiޞYX_!v691YYZ UcJ&t 9 $KfeDc󧒀"f{y_ⱭC<|zzYTk%ZR%ZR%ZR!3 -cVZmQ6p5E\+ڢHL߫-HJ%bn-,s۶ЂĶ@[hkm |0gQ7F~o:Q77Mo~#iM/~o(~oJ(~oryF(~o }5~orQ7F(Ֆ[%&"5W^uZIZuߛ´& :'DΠ8-ǻ>WHA691't<ЁEL6.t7ƆCR*w[)rm&; F#hApA%J\sr'qd+p)n,A,sv=!zC"#&ՈjVlL6 #`=>(G.eUJ= gVR%9ٷ)ǏX):VхБ6C>,.$ NwZYCc~(UND$!;PȢiK'R(ggG4I&a2ʚ'U!$4&) tI 8rg Y Om̵ rmZA"ydFi-EBπŸ/^v \X][o+D7ypMٗxbZ3#Oe.(nZf+E u@n`ƭkPZZXZ zƭ|V:Yf]Xɉ,4 XHd{'j=뺋3]m4gCRt2s笲 ڇ@ d&gц &x[Ejn1Aj~/DZ3kѣAnXcX Q!lLY"6[<'%O:Ovúl**R䀚"ptE2#$%d *x_qǹ?$}J4v;tͬl8A*L/G,}E3jXU˵8'ܦC_e .Fe X Vg>0b L;FhO#'!j=#c$!-  K4J&H6 90m$2ZBi|rt#ۊ wsj:=w!ϛ?l<^ʽ^_էM*{*kqq79 jn qĔ`s; } $~T?-F,<ↀ@S GVBptNh:#xzFY<[mKiW`=w)nKMirr1k+ӾZQ4ƑP?a`0of}'i|'Z? =sx:mqQmu9Mnt=\B%%mMBbgyP1?⃳-өm*;ax~BO$>o?T#_}|[:qD+0K~DO&_O" ~hk M-Zɚ^,jYsƽ>$ur fau $ć/_ߎ߆y|2X{Q]ǯOMBâpq=Ml~:NaEKPEgɊp[bCfʤǶM6CD[Mma <80HRHqܸo!H,8D6s4lr|J$a)}uѾHXDzz=L_`e\KVx4Q2&ѸRʥNVjHpQ-cfzBOXW~'l8V4(4IfԮ_HjGO"r>{Tٗ~IVQj2책?L!~=WՆZS/jG1z#pwQګTRJz~L#XBd?Z׉`yc"XLWO~b8&WΏbGߦ+d \ ]#DS ={Ȝ2A]sCRzdRe v;᳣20ͭ13Y' ;OP%I憄Rr% մ)GK'8β92nXm #U>zD:⢺yz1Z*81ؠ+hgA6hऄz'c){:wYcڵI. 
͇w)pkiZ N%!YIK-LQ)SI%0bq%saU>s,;AFN`q 2L Kd6J$2U#gO>^yE|ZI-w b'ay9ͲQL A::kHm%FFR"Sp= de/r %R1IfȔHcLzC03{Uf'8V}Ӯqt9TԪM'糧@Z:quTUj{;jYX׀9 Rt ++Y.Dkn"<  Hk̽蹭Eh$xw&I`-X㕙/o`ЙJY101 *-ZZ<(xѤEgzR:Ny,w@ez1-kpQXԤi¯>tdq]U$!w6y%gc,pBkЃMo~.c-4Ve٠Q1Jh3n{>Cyl)*mfRqrG*?C8c99Pc,oǕET%U]T:&L@(Iʂ"j+-VCHΈD`sjo5r0񎔜׫.N?᪜NrAimw͐lRmdf-&%||/1z2k$'<&GFrF+1JA|BlU98XDxgU"$u%A)Yq6}6 5 J8bXmZ|qIŐ7ճ> ,cyZ͸G,Ke (+<,P~cf n 9F:,o SDFqEf`Z0УYI&'rn:i( ZF%\2J 2J%P(3T'̲*R Z"V(lZ((e\bA@v]H9"I)!p 7/7_61{ijqVVF974>}i:˙yttLl,r-zNkk,s(کqs1,i&=VaxԾl[ MvXi{aP& s;#Y1c28sL!ࡘc tCTr{shA+5uyM{n?[ϭhϐ|#} ~ݟټiդ(]g@Y J?i[L.Ky0Rh^5Da?Yy-/fQ+YS7e{8W蟽[Ƣc\WF\`b'3Ӊ>k)09z,\M6 [}2aZY++}{{FE(ڜ-rm|̓Dcr3$;GNw78Ŝ9s˭#t,2v̔evy/tL7Y{ɱv5Qs-9SAfOgvJM>b!0\h4iv ]IhAFp2{`mz4y͖5UQZs|~5{ DnY6sL0mc8 28#J}V ' e48A$-a<'8 Es˫mO_g c.7[7g?W5P;36K- jo!3}ux_X?z*rDH:w&HR%⢧%SP'Mķqx.5F"Qd}KH@a8Cr>vYBɔ$]xjڅv]xjڅv]xjڅv]xjfBKX{u^]W{u^]W{u^]נh4p~70`WN& L$AvQ:oFK{ζM;[.h"){{CM>OϴQ_gEɗ59\f\a 3JG07j;SZe1Aa#A2sVk/;^s| *ۨU6wMU? r`*zQ,na't/TD:C] }(ugaK)]ijo~)nCP#KdD}L饛^y0OZoÃI˭*h_voצ5sVp栝 bxP%&hgԵIR*Np3SZ&ĵ r B($W1ИpV{tHbaJ;zNq_̚`wvR+2 y24 3= mmpwP@Pi/IҮnإ)Vy ?XNݠ~5WWI .l%`;PA܍XoQ1aЧN+_![ÁmTˠ+jJm|=dzj9aE58vy~/m@ܾITwguZfFu|Q<_7dA; 4> ,X Z~M0"xhǗ%8s gv1sA8.XS2S<:xFSIozzq줣tݠ^t {vaZ Ӷ xn  MwR܂ tYҔ7%_l?hn}!m{XўǔbaF@{B}|^qQ}̴^vhVּsf:!Ҁ{R k{E{w+nKk^mf[1y*2QxM/va:׶k6 ϻX}l3u7cޖ '-F^2 $&QN:T4d BX5'JSxB9sΥh?麔6I"ha|?1M[*1_Euu{"'^q N_[;K;p^.=;tt t4h7'`qREY-@"/aN(ESv2[cI"T;H$ ů g>$j!yTʘG#,& 4+ĢaQ*$8M(rn(֑aK >"-J))$mֆ-JcsEqJ+'^;}P#`}}A{Je`NY4\O,[eKz? 
$@4Nmɓ;$sm03: JY8U zvN2ɮ7i(q:(YE^;oBHcc˨%V^SB֞ڔλ׹_<ٰ-so4p`,a³ r3MTP"gL'4N$8 mI.:HwD@4:j, 3âҌ䪊%pD0AP8@2AZpy+pPU=Rrhf݄Ø+d%x"9F啌QWXDb}:?w>Ga& DHW:#X$Pi<8Nb*L1R{M7lB.j( g>H8͠"b 2u\j~H??~݇O_0Q_/>ORAͦ!PaϺOwe[]Cbt-x|~%os >c:O\*=7=rKs ]oTL2Wׅ+0l`RSN7~E9<%Q[_E>T44lKk+y4hIOXIZXYL ħpH1Pt("54t0ֆKkG3N( lzKd?D4’ 2)8y %SiRrŵF[披+n۬ \|OFeP>w'0>J*^R (s)Pܩ@oU: ]VNgojtxl0%Ue44Xw$6uz31.³.׹y]f>*YWbl4;bT%tY","$e8><ux7RϳokH3WpWύRcuEfap\%<@rFϵ,wJ8w'©iUzV nY_:қ9,\&uuȞN7!KwQQpl z C 4ϳl۰ȥ.< 5}h'm/t2xL+gfPxQkUfѹݕGi G Tڐd'ݴ"s*Y`&:[0'a61F9Pztkf-k͕> +P(@M*hHr(sDs]s+q 9 }Lt۸t"(OV׎ߎ:BOִ'P̚/]#YWIF+C0!B*CJƌL"^ˈi!X ih?Ȇx%79MLLaxE%"3(&q 9BQvsr&) `]QQ?>8&0 |# T)+%R4m g=2/ S&}>WQktr(cyZUݾչŀ&ǥA]?~+֖/TVMH秒U?1rGl*eaʼ L#Ut!J&qO%gXi ',%0C(i,!|irK%bdF/cL!նdlm8@[R" [me!dIP¸zNqǑSh/_AnC7B$u`1vc+ 9ٶy'>٧dֱ,y$3^ =VKmnLYd+X='Ih I-u"\JE˞.Cl`;8j+pJ 1*)}.SMa͹L R-{#<r@4 KAƟ,rZth:PMXE$saIDT4?-=x[8~%rb艏I%1WJޤwO䤫;sNꐓ:Ki[%蛃6*QTSE*:8\8ǝN,]uK 59G#uJ*QǸ \V\(+XH ť7zK1&*>UjȾt@pGTج%`S_K@}ӻfTt%r|__G-RJ.*QK`{ yNZp{iA$H*U y';eSzMmz{G´C&-m)p0OG@㣴2cFL\d^R &M$:w:Q*eWFk!hAٜz79e"\4]o`L(RpqjNK)Fy!Pv./Rd 0,kQ[Cv3rsG(V,:Ef(($ZiP?Qp$Qs`,yt*qz-&iN"! ?q3vKv6ڪ(@s/Ŗ !d$5Lȑ4aI0`&j +YFL("'xa9+FΞrֽ}E˶0*>JM9Dj"g ҢQ"YPDFo{()? 7z=>0FULWB4\Ds$th(,!Y`8BId1jZ~>OzUY7BKZF)ȅS(J˙uɡ}dʥ^fLphq@RFqtno1ICuF' 9H9V ֫x55jbNѝ3\cx/mvD\=s|gR:cc>,`[<r&'QOeb*6-\ncLQ#=O^QN6dll[,|VlA N?960r'Fww+6>6WO=`F]erABjL%^]QT r6&ל*uuUSizu++ -Yp8[z01 J+Lo}:شa/(Py%ȕdDR$G[4 |v3\ΫL%3zT‍?JdORH]e2W3[)JEOַ!}¾} 3лR {f'HoyBA u[CF)T\MBzw "ɢQzTđ tL$08>9ZH2Q)b%+p^1rvGNZḻ9[Re=3Ou.r~e7>:Q݌C~\A,$$sNEccVk@wܢ)EN:e!QlJ t/Eda_3:0Rzq2(5u#]dnlǷ}XU8YR&7h#:tϛ 2p4$+a"6n{8}ͰfXGk(%$"4"!aJ":"g@$ʪxP6kL0#E ,85.DxS2P ᑚ+1!ƏٳfY_Kiߑ -tvd+lեM+|},ڬÏ}߿O< k$J׉E3)8 Hc.QN,`T&gR`\X7\. 
5c+nQ'/Y#goY?aW_j[Ū>ŢU_{YJN^q~:7CViq^JN8+!UyΦw*slfZ}Hxױ9x~)Fa\[†V~9GpkԁyK?Rrc(D`FB-MEt_EL r5jFf¿6 ֖sE:+&2B]8M.hBsE\ /\kAko%D6vs&7x_Ҷ}˖Go* %hZJ>{۱fh_l5=$!-d( jUH^Hí.dHKHE'P9JL2s:uT-c=*3sZ6`+R(U=& BDNJx0USV~ 8`Ջ Km=X??/S yyW]ln COz@Po8֢뗱J;oF IUhDU6B` Gt' ΐ}"FYrd4B~T&Y!:<͇KMY%> "`,y:"]9aЕVXdm~ln8ivdYUl*0k^n:)7'sd kܭ9f8Co(`4ݵdۻin״POfGe{ `yʳ%E.?Yz`xe /4L-,*av)EA&4lbüYږqYF&6{-%-j t[|Qzt <ޅʋ64oYr͉Ihszwl<6u5 OLseݾjZfUa6=樔قM̬"Ǒ}27];eҚD@'=:}g Z\Mnx#o'B'F`샙}Bo>RڻoY{=ԭDi$):>7؞╧C߆^j<ŭ%g˗V {ƀUxZ CV!Fc9 DہGw@t~]8!Z?hr9DZ.Jk=jdLrČwRPtMM|oD9ϥAl2FNpg g5sUL"LqH!x1r ^y5(غ{) M+l0-)\x;!2^i>?r嵼,w>) مij 6aD."L (q:}Q#II#CH:~P !+4ɶdy@3xљZ6L 6TX hJ5JD(Qq%@ qcPȸqu~ֽjNxJ!68l`ӄ/G$}QH+\ {sקgӰi{QiA2P&20Љ`'\,'*K1\;d0N2oDX\ ͉(O@_@T4^"ڌ&AFD kmN&-*gmȼ:>|?oez~H/Wߚ]n\6,u/0W|qgV /H.CV?tTh%#f|1 'ZqNYsEA<[mV W%ZOHtrRhB. mޔ{8g)U_Wk 1 ,z6> mDk򷦿Z ]]]/? h݅߫̅9 4~{@;q#N ܘ_/Դ,-Fުœ:0{K (+b87kDdy jqGqGAHH$چamfYBxR'\يOƣBϞ [&:*#G]NmԶrdQg`!.e$,}m2YL KzAg㸑2/{c>!g6AbFlV#۫,bCkFJj@aWYů;;zSs⯋ΦǓ8=}Ol^}w?oݫTՋ|O<9֤SD`r}6խ~[k;ښOLGpԇ[i\/i䇷z^Nr .VOp\5#VU,M+,6uY좗>גTMMW֍ڎlTa7ń;qٱMoQ`}N6R:@0*)hFc(RA!0N"ddrF'ѣ +kUYo  %@H ! ٢' /dsqi3(>U9'J|e{'wkgy;K7nU#iKJE=2-;|z#FJCSD*΋b}Ll 32B6PSӨ'II *ٿI* q_!:bȫT `r%$)p)1`v rlIƒ5*y9)ٺc+q}+x/ΩH-볿NֈS$- U7WIkYFC2jp4HآM YYP"OT7YY%HY{8Yz3T印1 4E`1)!>WL—]A,A$pC={"j6qĬ,06)'4AI" *nWzIDF@ߧku9u`|ps fO\zU"PcN9K wU% GG^mG~[x(| kxpZv"x23{|.emlz=bw? ??,po򹾚y7Ytbm^ɢl0_ &/~~cgIl]..L?5h=(#C:Mt ux%&%[ɕF{)+$ V+! 
y dc@&'_NN1tZݟC@B͋ϦE?a޼=U>"h8Un/krEu>K%C|dkFӛZsԁI ANECH~U>=8Yہ M̆-\_v^)2)A~afDg_)m \9ch/d}:ʞA)2) @ؚJCb&l^_eLLH.MZ)K冄uQTW*8  KDe#hCZDkբ#st1yyDejAP~(_)ʷbegO`m2)!<9# h$=\.юN=`'Cюf3d_PC}AзEMg4P晛t}nSJcZF*t}mixq>ԶH$T>S7DN[l ).8]h2t4#m4(ݻtfZNj>yϻ)>AwrEP6ܳ& |Wq :I]Flr=$r2mwo^ulƪߖ,zw8"jW\-{*PW`E^;u)!@*'oҌ!T}!q9i S4˝v;Ȁr#`zas0 24*F .\=R&0J;$BME '+sQbDT2(O69ft$%A¼yy4zjl<C1B'(C{;urLoD#+6*/^F޿μ"yEn{BʹU6Pc1gkS= #i KUDJKb0h lJ;)ѿ_\RK[kpNH=Yk%WeU%dd/Q&q5Fmv UKJQB̦x6+eTc2ShYF5:_+qv`vg8% l4'{_B!EF۞yQw##?\QQT'Ծ[>X4;yպ]ecԐbaU %CV+sSh+)*A^|7.@*aN@VjmC(Ila-4-lZlok,rŇjF8e2Ѡ'n+{$eISE-SH-ƌ'b+jcl< ä&RZUBr*$W5 o/J-6r(V6}j = (IͿUvd m@`3q exj{IJ07l%kZ:;5)hUV>ZlR*&NuvTK3qS_뷝MSq("юq[AAqvw<4y`$ADbKinjBXgZGUDCJ^&,S[hn&~aU!:͒}olhG8h.@)6$̹7)7V[ltX^#}mN*nW*n`UqƎUq?êl5F666ݜ@w HRMN v<Ҏn=Id%@A,hlA6@f9W(b6m.c>8`#=/Y2{GUx.*$[)Vm{׵Kb.ܺݢy8g;:];;\wv~e;_:+;|pvrէ۳]>.s~'5\v~G u 4O%qzjtyw~1|}D{m %>Q(X oU/M^,DnδHs1΅(:PO)ע@Nn9vN1@eLRoG!TNZ"\koɨH 5q1f]lk)BAO2KJIJ DQx%r浴n'}iI;s~WF+og  *?&Lj>0le~Utjhu:;9>gݲg*n+YM06΋b+c2 3HoVnaA3cqg dș\@'Ĝ_!:bLRVpڶnJ%$)p)bɢ Cr^y8]kTb\qg3qv8#}v^FO;z.8SQr'@+@ _E !; :Y#RNYxewRIR)5.Jn)j7sTJ$ h!ggIkm,b\Dj}8;n~AFM\Q)r!+ C8 Q) ^yU,>:-;dX\LIHAMv4EЉ_e|"HC3k"`RPZGeA%fH)2x ]P+3P^IyVO^m(mΉws̳=Bev`9ɋB4?J|&t&8@>l`U'W=lm#0Itɓ4.19T TJ^a^aW}+BuEfE^z|qHAY-H.E蓎"J|Jpҧ㧝F·ѧ~xk"{/vVԼV HYMbleǴ*piPo: l"fb3vdK&oO:ֈ2c pBw'}q 442lԋ[YGR/0E,r\,WfmD}A@P3Lhg?wKz!@pEkt(+N)sȀJ(YϪOM:V 9IZ)ߌnz53L'pt&9Tty'_ 6$t.Vd,UBHB䈹Vh5U-׻n6UtQ/}YbiBt.kwW_^\A8^LGSE_O*rj9'`%sތihx=boGuZ}' ̾X`>LNVk^9$MO?W#>}o<b[O=} l&ȭ\_XiS.`]źN'_.z~xcr[{׋Inu\ez/tvlX|v:uQQ^}HKMov|* Ӳmg?֛Φwh~> {$IУX\RA'JQ(RC`&kMf1g!@uKkOPD}1Ep@΄lQOJ+YRvN{.u\7Zā o#rқj_z%i/-ݔxLMl 1Bjbv6i3b d 莼4!mlg7]pѳ;@^\R ƿ$.o$- ẓz!>hhh4ۼK;ǿ{sWWO-CvEB7y,-޳{ qn [;l"vnz&H5 9J*}=T'߷)EN9Ng] [OD`J zZ&m@9U,NhΑł&& MBw,t"W2,yGyW+>uâ!ETRbK|  RBTغS׹R6@ ( ٽ G.̎i%8YV]d>ԩHӂ-9sDcZHwHW%D=4yVȘ&weh+ew!]dhr1>#v\K,:Cʁɨ "0U65FmV Pne()^񿍬JD֑Ɓ2{:QhܭlIw=OqvK_|њɁhWAL _u{?ћ.f_[٨x}{L.\kkhh-aM^肊)AZ@l,IV4P.N;C I-}*6͊TX &Sgӓu sht9YiJi46R-cQl'Jy[L2f/nFT8ǷiaMURƙ:틿zd􅐟b+佀 D9JIdt(Q%DzcdE+^zU4/C@AA5::Z =Czj:9"pt˾Xfm= ݎ- 
<ri(g毐U@4YcLe\Pp|,Ma1!Eoj5PJeǶ&eA2x sM!0b &!6͆s=\d=q_,b38"q;#*uDZqe"F(IEo"Z3VZPdKKnjBXg 5URLL6[TA;NjHl n$lZtLJ]t].(>R" #y P f1A"J6lXE5YV 60ǰ}ͤ/;`6Nd52_=rؽﵫEb魽WLF Ҝ: v!&1]1t&F(~{bD2FwddF:idVT>+0"Gs&|˅Oa8?~Zpl6{G]xۡAe +tm0*m/Owrny9;!Y( N͹v/<>oM* $: ;RXcρ?I='w$^tc| %%)mH!aVhm:*IJǨ I+f; $}uGe1ICA!&_ޢ9YUZ__$}mDO|e>72_x7XM+V(xHa|-` S|@ fσW2 T‹Tعue^ͱL@ur=f iNy:)u+i}(0S%ǎQeBP&;(ZΧmm] Mوm6Ԗp9]6B ݋m)IE+saarDd6Gz紖ŚYDR9#S FGfNISHf#>>SM 2.yk+EP?G% f\7T7/ bddo]ėd((VܢC1ե8f($J6if1=` }R=/*cv3thRS6\tN#hM( * l9KCP4ٹ %b6f%0zl8{Y&!q_"%RIfdщPH& ^caԛ 2*R>~+6{@ @dFڔٔQ$K7 D$B ^D!E\A-ZL4Uz#X&^AJB2AKl,%x!,6@+xǍ! +m Gjs7o6vpM3Ic 1 `)$[ـ_KZ#1Z%^cՠmNDxl}ILDU>XZI'mmPEg6&)`KuhsF0% & A=ک*Uw3 9T;vxz>+d{?-ل_鿫tHT 7|,[O> uZpq:;uZI \R}-bQTP> !ÐyBnBJ)Ȑ_hRn袲/Lǭ U5SW[V"7#i((cXTrAi[ԐѫdDX6ΞyfN?Sy>>?ֹl6x8Y=s%&)?>}}.O{5F%4&hK2w)&+%E|0{b)>3L Qm(1+M9B,c͋tUJUH&͆sߊ=6Bא`W9*-v^10ˆR D]b[}{l߯h*v^X錪 RNWK?wAĹ5$^*>Z';"ѧN3弔j1Ju%=Iz(*R3+2#VpcZF)$*"5zdwTT !ZM=zeВhE<o67 ugP >uV= Y%$WfؓZOUb?*&gU$t}V qg?0WΜ1.9~gIi'(سRGSJmΌ6> aLc|1ER ᎚Z|^lYy0˞J.kvg-#;9lܩvי3 uߩ 2`k U c`ꀰ 9Hys1\`@UѪl*"1bgQˋ]^6!RlRt=߅"CAPC^ea+-]>iZV|,FEC|eenvklޗ`l@ަ-Ю~*5㯹Ӵɺm 6pu Iuw[a`oՕr-c1[Qܶx LqY̙C:<+C_B]u񀫝!]2$Up;oB!ń3@cĜE NVFcv*}0kzVǨtW>O^yz;|N0پSmk4;-ֈp7lT䖉GC"O'3AG2/r_q4 H7̇ۤb+1[y3}q|Hze*/t+_P?ɧ]<: P^AWt7.) 
vsIip㠺VT3mތkWڦ]xm7(k9]M/7g)7ySN ^@s\6p!G6=D Zl/`BBeb7ujcI[6Rۆ@h"[mqIFr\ས['r"/8wCрI f옦i ) :&>揸7PTo's?g.nSؘz,T7$W R"ijچ [mg-6~7᭟R zDˆqay]4Y'"@'=:}_=\ϻ޼쒲w\*%I o@.ϻ $&QN:T4d BX5'JPΜs鶟t}JVR$ts*oXb>ԭ%n6:mQ9蒥BWCP3$t;@$n'襤I vvRvt;.I]FA*rեD-#箮-+ʄxUq:V]%%uURP+٫D%c6 RW`J/F]%r/Lr~*Q9gRW\9s犠\5+*?dr#,CD@뛿d񤩇9&9R7T0|Ea[UUٷ7HO(=+MLSᯀA&j3 󜍊f^6gw;S3SW橚;c̱|0{ʸ~> CV" |[y/K{KV~<<_uB}Ju\IQvx*cUmI+K.VH֯l+ۤvgY H(򹖜N)bxЄ~É'%?y"K]o zߢ%fSxzIz\*/E]%j٫D\ݧ^]}CJJzJ0G?fcVruSQX_'<5hF`&1A& d"2)%W(z&L!b!QQ(:B>f a7OCɯ`auii[e>$ɥteϩu4.wӵ4!FۋPGfm\PCL?cR;ĬW`D Y@SJx-ׁ{'mʂS.E˸ޅ*`H }ؐ G .% J݇*{U-~&҂eKi'k[= 'U$ivߥߝ |0>4o}8|*⌿ɺm9uhW70Nױn{`=(y,?Xb%lZsRʛK P`vg:YaRldI>x,0j ]p 6oFƉ{mn7SfNrM_Ancu[Fy۾$e{\#`y:\{ m V/,/^&Vxyqcj`W(^mۈkKm6va=limC%>>ˑƽ;'-5:o߬hW/."Rý'n?ӔabJ»I# |OfE噋m6^O ;&)4" T޳K[mn[?1E6Qe=͗ M{NPinNDNzu(Oj6wy%E@0U' oK|߀\vNcP oLv5 GT(%NG$ $8V`͉{D 3\'],d]k>LӨօ o,/@nl u[ 4DEt x3ٻN \[+|9zztMӸVHx폞F4,b'|gܥtY&,mg`X|9S C0E* m"sI9^7r7|*B]RS8;ey,@4:&gOa`i%1  Hdb)y+Cކ"Lo?S N\!-pő6rX/d$&0ҺjD4"9l ٿ/ߚ^kkEV0/"Z-#^yNs`m(^Cp8qIV^CepQ{(HG"(mI%5|?'?3SJ;L^_׳o׍.r_eM*s~]vg +1d?+*A^!A0`'^ lɐL2UaWX!S<)"WV J1uW,ý~*ǠPgJ`pUtb pKMȅ坑 ܇CPyd|zt×TJjQ-wӿϳon G9Zj$)+ӕ۩t9}13:c ;>F|-nM'#Y>wo*1Pψ7En&sK9bT4%M01zb't iFnŨO0i.>*_]97 W6:dSMcJ/ɫ2:>=2d%qp6Yc塊lzmA18s;`?:?MӇ߾?|x f`\FDG`V/we[]CӮb;t-x'|~%os hUفmQ>aPt94 :IN/&Xă.|_A'eGTTu*=XQ}BybʙiT7ߦ}B"Nȼ 6(ֻTt+%C5*)&JEd,B7^0yy%>/ ŵA9F:.E(`(ݧ Ix8ȦBuvu_V!U>E+j}+|>= `S@ 2(D%2O2jb$X\-nczQoCQR#)"e!5TǔQLNl1fUZ8BUe!1 mIa3ta$d)"7ǡG{J*1g2-IzI#$ Q&e KڃFD^dAQ$cԽ(e/}|x3h|@{8[jHiT H| |}-XHaI?&T|bH.eYDA IY0*͘1L^"cVE;zHU--5M&#/}9e ?׬~[w>#|&u}/_b١3™"vHW>U}'盓q-/̴gt|q(G';b"=\)p}Q Ԯ1NIɧUYܥxx qEf1:̑tNzP\=_lUCu Ɨ>& d8s+7g+Fyd R+\`Z)ƲuK&`Ɖ5I@v!lIodbA$!1B*i-C@a6ԮZ_݂NmN򢋻6$FT㓗ށhvn7j:~|KO^A?)wmgj#-W_J ۪ZyHʎar&)28ƕ!eF.*n#;+ukikrТ<KZ&lPDIQs-= |&%U[3V#gHjX.B5 {,xC dHww,z*/~7lyk`rrBh0AlD2JPsլE$?+H4f Q0H>nQR5D#\4|;f]H{:X0Z8[.`zh⎡hm5jnT!A #M)t#7ggH8R^$!>JЪRW2P*dHĂIؠ 8RyYEȩN9aԯ:h8X?ՈFԣF50YG6)I'YZgΜW4E%Ft3 oHHK,$472@`7P;955^\6p imehF8mϸr=S,R$[KA@^3 sRHD5i RbԋЋCчոc(jGPa먰Z^//FՍe_pdvFg7k?SZ#G?s]\@#a 
ڼAq:im01ph.GVE52dD8HKʂ'ih):1\k9)q0տY~ZYkofwL[oJr>Fs >u]~Odп ܁F742Fos1z̎g}5 idC#W{Omy?fT=f5r;?,-nrs^!wij|ѧ]|vz{wp͚5Wo11OV~ۘL,.{빍3VAoզ벳v5Rt'S`!ƸPjt{h-_9*[6EG'HN9zNR6C7wz_y47\3`I'^uL*W$ },k5a2F 8's#d DAÈ29 wdʠ:lrTj2 φ)BK&s/.uA5WtFmaHX aDǹAHH/FICT5\VN,E oU~D#߹oˇ'*FZ$,wZ *fq:򜓖I .%ȁPg)($!T8 Czc+r.J -ΓPxCr 3dlxn*gK#Ўqb-DM!,0mJjI2de9FΞr6𸦴[N_" #OA V3r\ ʓ4"%HZ DG)W_ z=B%DU똬2 d12m")!x!,T tF? @=$x)/8gkm}9Иi  Eݕ:c..χV'LAn ڡSoCxAQ0GfX;łAgHdW@ȝ #1G9J[rU ELj,r·jOf4qxAk'3$R&6p%B(AzBhqAȣI9(c U*toqrY ki 4i~yN[ Q"NV c3p%lz(̟hBf S§|kDNGx@lH!g4{{5FN?zo,c!L nz嬵=;Ki/MjUcZY縑aUZ-p^Zy`:gɀ2Q3B1@%Y!71mq#6A;KY >Mn4 ӤXj4\иb>DO&wM9} wffO Cuh~gp'Cf~Ӵr5[3-= 03cJGH\1j,XJ(~'u^d}Ӳrygjŵ NBR|4`_u 5V!@\s? PcD8s *tU)&ФW72ņUo%D4l RU^KMtb"Ι+2D+52%>%n丯/0lx)ft?KI d͝Cƨ{X>PUblu7*s򊾶}FiDcVn46np >,R:#^6_2h1NI>L(VCCN׭}v{ xOH#Nc%79.|ڌ"?xO˨5HK!e- 'A̅ƒHmd:0}fʌחAWV[kC}gb?`gqWi^LlFƅ[З@a_e܌r3H!F0)ȣ`Aq\4F E$V^ 7? }߼\<%~ inNH-fJ[njv~fWͪreG|@2Gg=Jẑ5  (]ô^5N1"Z6nW}'LFaZnHZkr"v:Nl.iEGiUx, |~ŽѕNW p8Ea<2I}[Ub݆ګͤ֓ј3e&}ˤbzC)wB[ƺQPկʆW MWqAQIo@.]*Di˅bR[c0efL"֨SK"ZTB(y۷M<39&LBUkouQ{C>qE4x]\`{I{B :o$hޑ.%GPO~㨕ԮbG%؛tArR$U&dI r֧*"gmt.>1t@iALՄt! ލ <3t"U8ɳ#^P7-tϼkVX=ID\̖}MI_.|I&7+O4W~B8_.֤/WGة>PZ?smhe[3ABs6A26DF%S^J{WY A|}pe!& h# .D/m.ga2=:e4n1 Q4޵q#2?%X| 9'n,[hZi5=ɲf4(KnYͮ*XrHġ'S^dl]9;Xk!U_l 哷^G= ^t~S.9EFH,A$5t(؊%e5 :   !sx/tb^ٍe{V,29B`V%u`1$<-Y)-K2lX4wP cP̸oiFmMX.oډ#$YX4l@E}^Wl,C9](8rƥ驖]I'qo* N)rH0CaQ62{$qüH9 R;AO+lL*>KdcW c\Ҍ6ulγ1Y]JJ!A6ѫk&ZW!LL,~~Zu0j_kѕr1şࠆa&?02w4Yqju'=a0YIC |=Q|5BcC:d?!SJ{a>xt7?gxGG[[uFP@檒|! A10p3Q[MG׻&7_Mӛ8{rZRIȔ'gCG+Eھ^}Zk:Pz,m/Ԋ\飹=όjhg ^'y7W0gZ,W]T]ypyu`)1Y_r4}{[sd2;pq`[O;F!6q$Էti<,Xh3Nz3Xg'DϯƜOϷ<KQAluFYjyfJFO'gv}1IT/dt`[yoΦI2;ћ^|7o/|ݫ7ܛͫsJ5d J>>~Ckjho5rF|0-yøkӴ^r{R䇋_oT]N;ujdxZWU׳<ɫ0,Nj|]Tx-YSsBٌ ЏT[0aʘf,bb#D DdBE_T6! 
+R,Q$cg(}?w-hk4=O`pB]|~{^ZAb|pC^ںH.tѠFƍVIs<$]\8'k(zZcd)^kTDOE]c))`ʙUSizfrv(1ypSh-jj\Ue^d]8::p45v%9'SLL)hlRX QeZ`2Eȶ*PAƭjS Fق϶m'/C$Zg\nE;5s0[qP`q|1 <ry-R c@4YeYMø&Pp\,MaƆ5#C5i"˺&H\2 &!梳HlT >lF7WbFlՈF4F581%d,Z(pIV Cd$QȚ"zFN a,mL,54Bz&!#"ؼ V+1ALO4#nUG֋n)\Cu6EX/Q/zq .Rӣ ȦK5 * dr!JEQa`^| 8}،;*l]^s:ȝ~|D~HdF>7F?>SAqܿ Wbʗ~㓳ltfס2Λ̦q7FeQjpp@pQ!s(ژ(2 &P6@x$er}D[b }[Ge)z0Q@ݷvg5]/Wݻnb"r+ilivk|TtMnqknz{<Qpf!j;bnM'=_5󥱲CZn燳-nODr̻Nql;:΁ Gx-9W֬zU]sUs0[n+y\R}yWE}ݤn`{EI$TK윫0{8HFgfF֮`r)6օPQ"d(bT2)QE Y7n~>KWO%ɘN>IGA|Pfb4^lTiߊr, {7T&<7\j+-P&^STz;L,]1#^}&jr3fQf뻒l}Wj }R(9n}a[Zy͏3V嚄fs ŎQeWkvV c[ޖ)rMm:z93WNF]dRD#bf@NEaMY^%l6hJ4"E6=B/#[VIf쐒u&y}|V˵H?wn3E*Nz1ja&A-d)dT/%%B: LreZ!I0+l<^l$CylTs%̺&: eAB2#tX^(P∬3&r֌lxa執0*Je 夲$e)+3B@*VI^k >!c RERQ}!Gc29K(rf"cH) 88 h'vZ{Hgm'>{ B,G00l%n+mǜ/NrOm߷6u|7ީiA&FaH*$SlШD2>z-]fZ "CyeY_Gi8#F&p3·*K .1^$V&.I]f@ϓB#.QhdUţxؠ4J4󺰦p7&W?=r16yb@X' vS]7"&NZTjb trƮzATvc!,öpr1W%;LwK49KҪU#6- tMnHeo=@oC\-+HR@4s}`D{I /ff'f0V߷UnAW8vKuy7V-PnQ~k fkk(0|m֥tTУmgřj~k ˝)tHc⏲WF4W 4}U1DFMAcPԒFM<6BX+ Sn[Ik J鐔E:b/(wmmgd$Prz$[[qe]*\%)R!)1R!%q(2eπї$nQ39/JX"g%t")Fv|&ˤ!3:&(aϐ}M,*( &Bh:L9dH|#Z3L7 vvȰ{:7ׅ ԕ-gR0F+osh$)mGn=3<AZ TZ3T/24e(TE0uR-GZ1@#MК3LIl"jGswvFfex_z &3ڒs|` a-l4xnUs¾!p ɜ"Xڀ!V•)udSRFye5.¾f4Rr QEtdwxbW2΀P}qh8ЍǷ7)ӷЁKJlJ`5E)4pOOM>Η&(qK(!)ZDGTqDRQVVeŃʵ[ a9|0R^0DgI)mA.JŬ`,B81~쌜;+T]YFp>6Dv= =̧ZkCe6K9Aᅬs&U&uVRN4.(*I#@q('i;`oL5*J1ȅlH`MrIEJt99^ 8׳ImQ1{_0])uYmvh/Yj& =L0sПP3RARr"Q}$ca\ϽU,óއ%\Q_="O$?!zwh:k](6ߑ/GD\XGY㕒+s[+Tz4,L.i5}әJn~bZjP8qU>5{_qŕF8 qg*W\iE\ejaݑJNJ)$_*, i4e>t^}mƙL5q }.<_OU`5+H&LSoM)7SoM)7_bvUoT!B3A߭LzۭW/(0ܝ}?u-ަQ?Qё*q DdtFOHAS0!C@5c 9Jm;y-7x*GMD#Z:^Q&O<8$a>%'uی>gC"⦋wsA?̫Y~\> l1n Exj:xვ<^ƕe4 O!֫C !&p&PM&fFPG)IJ E'2(OΓ, @#y= x#"Et#y2^~-7b#8_Xr<jiR\4%Ql4F8NBP4Ʋ^YWE >t2Dԣ@Qך]m&.xfGj2_qfMjP|8 B'eBR rϸ,31~tWChМ.bw4^$5ma:1Myk&f|$4w scd3Wn6f&ic_ jZm^ / 6a<1Mf5GaGnDt |*\1s<^<$Sm1!N DkbXmEb$yN\qD{J p!EB N'N`^fGֽN(ހ}?T,! 
(+& Ҕrݨ2A@@;eV\f} 4~YYH%Q$+Q#vE~GEwQQr7Ɏ7;k|T FE%A A`R+Q SV j-Gv(bݍ.8avF=VUW*Mv}; kb+I 0 74%{DA吔IwR Pt2B0@*D{DEfDeDb/QdL Q&Dͦg:>8{ʭl@;A1NWzՔ|1Ǟ0$L;ݜ~S/qrmUfNެzQFLr)7qDl ;$GgZqN?IpQ7 D+pU%o,7FքB.OsӼpɧT}^O#bYkxݻ[ԯ''Ӌ˕݊qMNfr #VwG< [6]>qG<6&ʘֽYlZ&_fw>77>,PL ~Ӡv>[nnv8?_Mp-&wEޞkI{h놵wchk7ˋ2mQ-p,]|4\'zrg?]38^Y|mRuq#c/Ϗ(8G7=ם[Pk⟳=߿<'G?~ۯO)ep>~=8FW$BlAw-uW]c9tdn&j[>(unrb˭Ͻ}? ?'3Gy#욨3zza$bV?5UC% EEb#>xIC?XզGhQ:=֑. TK&-b~BHs(I0-*.FB\ё0yU=ymϓR)=#AK\pL'ah ?; NOZ L3JFٚxb2*JMFmNv^G͠]23n;M~<3 5{~~4 įnWww3㕏v*Tb4ގ`;'#TEQvTxr5ԷR.y)EM"Iֽ\(Uk>7r>[syw2z]Zx!o#JAm- }4]uio;ێ&y1-xo9ֲv7:?{F Y,0;Xl08ge6do[/c],v;C[MuY"0|}?uO[yvq:\`6G'ar1_.0OArd=r'wݨo'4'_vbqeZQvڲKSqN34,/T%fvŬFZ}!Aj]0LĒl)B5;z?=\lr9Mt=Mqg査D}=;pq#ǿHF[>h zm,Mgp~9Ӥl҅&Wgi>YQBޯcꝣoCq{ w|U~ߒ5=xMWfv̏t`$7nլ֗iܘFoJ2B*ΉBRg_*d \ ]'DUj54 )7ǘ"dqD2X)!qYya5NKŘց1,wBwzLGdٕUN}^ezӌؿ@z 'bL/gtŁY_qX$txp dƁN$;րVc!- Ҡ:QkT71L,Bd&\ -;,gp.2uU֩Sr))0 )ED1q,F39 gWڃ:uȸ=&j>HV%-рrZ 29r0$cde ge;%dBɣ, ېd6Y[L$d$P(x /ku簡^hiӹ/%8y~dGPSۯREcro+{ʃU3U|DX9 Yf|)u*p1d2KC ]d$XKueS9\"M&vL0Rl=DNQq%]&պsD1j\VblXpۣroŒw|8fub?Aft;aP{B:5$^D;E0ޓ0"#=LxO>nMtPF 6L36ZBUmjouaxMZc,-j7뗈Q. ֒G1chPOH:4& N b8x.x#"$fȱ= kbbAAT19I;Wu0N}I^Lc*8D6>NEDUUCĆ{S$gJ#L47*iάGN֝ÈxuU{e61uV%⢩b}#q)3"+,3Pl jr9I2bML .>.ቊ9Oa\ϵ\y_qFG;ע_(SG? m-,h:ry繷UV)[rMKirhEg4+8ˉx Ò̓RG}pIk9)q4w,u=8z>8u'vmy>^Wp?u9'|Kݠ!x!uu7zz偓n3fBjQxwƇOX:7Z=/z}njCnN]<ԼӏO\vhn7M՜mӭٿ[v⋸|%}vP렃\//dHu;cʟ))I"']IsՈt5 ']J1xonqNhcfkG/x2rZ]pw)ax\IVeA謏PB_F9J`)*kona'}crOVK qx?YW\ ,xIߔNj/l4Mims!1_ Wc j͏%$᲏wӹVI r Sҷ*?#}c#o}+fOS;\eH$)Igmɔ`Td!^f$g^]L[ҙ;XϴAy Q"2Id}2АrDON /֝# CT;HT)E4j7h=ֈJ|\OT&ŇS L+mUR" 2܌6Kr^eYn5,h,RzjgE!Oڇ oS }H{b%" *eu*Y, 0QMu,;A@N`0 2t)tB, 2MJ(`ΊvV;G {tx,e)]< 3Gib\hQ"AFF")f5M˷2@}N!`tT*�ipIBpyYTYZa5k~RT$TxI3kKa.z}98=͏|0k P[VWǨjVHm&>f&'FiXYɂpAe$ u4_sf.;gPDz܋fmVDo*z)ѻ6DN5>^4ЙJY1eK -=<(qGr9fcKF[mjP:{\\5jM @HM@B 6y`,[rn-q%.喸[rK\n-q%.喸bZ_'.Q_ak A׀O X@R:,Xi?߿r?Jo>'r(__aw9KEz\wA?z\oW^gL+g Yħ@YvvwSkϕ>ﲊNWOp'cBMo{˻tC>?Y?I:s'4ذFz70|}?Nn}^qs͋ b]`㇟&\rGR|Bm.W_痋mW/_7yp䫇I#DYhkMg,4tBYh: q*UeBYh: Mg,4?t6fgMg,4tBYh: Oj: Mg,4Ь,4tBYh: -`cؼ8u􈧡ϊ? 
g)|' x#bnQK2$)j]N&"N{Bo3O8=ffp Flr9gpx70;@R1u`&+ !pGA3! ºY묭ULZ`cB54*onT BB)G:%tdղ+Ss>ii E:6hx\sa987 N6 3>Hw2FMqXw:X'ΓRuֆ)&fbkY(L#h D@384wChH%2^_|L 622Op9"dr[&#%+<#d,gjB-0w#'S5pݗůq3LbySߒl?F FIק\Q̏ΉL4Did2*l!R`<Ȅn:ט%Nhf-Z %<{ȍp^gfx5Ive` x#K$%SZleYjҸ'gK}*jRg'ZiK0q<:(8nv}ZoNmvpf-y\5~i q?!_McȲR**=wZ[cFL rư9rVVwۊ淠DK6Z-D Zr=i 11 WFQ2FGG*-X8ZZAʪ,+aX1h<Y8]..ʑB%Ix ȄcXNr1VNQvA{z1IdβH\H)N &0LWfQ82lM-Rޅ/_)u!i>?B&pCZs.BGJkHkk;YRm_oS7β|%J&yӰZ\Ufc$~`G*/~*d3ZcYgGȎGnGc#}BJh0H^Avs` d,s'K:EL6[73Vq?L[7k>_=5ۿ`@9\=nzͰJl~M}ңA"zZ̡G8$ @ _`IbLY7h"b`dIV퀧WF[C` 2sRc0n :@&H c@`E!O;S 2/\9F+#kAL&"A6G-y YvY4x MDS,iO7(}jѠ6C!CO ,RԶeO HK?CȖR ^.4a;沲cXv87sM1 RZ:by}Α8B:A{qWН稉ߛv؟N^ d:zhDRSw/s.&EE-&5tP]D˨LNy7414 -G]Ϟ;:RhurW~6:?zy&hsOLt^q\@9%y@c<1AbNu<89}Mkka q_w)\2TƨtiW+;2dЁ^+Wf`ȗ$Moo:Ʃ_./K~>6 ?G b?:ovۿ ώiOԶw`Qfmh}0l.Z-Dɕ^Njmga,ry0E ; ҂eշo ]*#9iQ]dNHU+\+tb]'vt \%5) Iax zEL9hјcǵZg7@#ЛפE/!e8`,ϫ{ϏݶJX;:TO0n8@XmEC_@c )-,k8Yȸk50pnyMJ+Lq=dFNи|MXGEdi$ͩnrVC ³hɤD͙֔;Ƴ`\Δ'Kp5sQh8 3xDTL\\mbW&ΖӖhzDGk\W$释 |!53?}xsŬhx}yݥpðܿ3+ Њ(g < %&qRaRPV Ó'%$f*Fb|eUN>XŗSfT1pZ"p-ڀp"+dFh҈HBY4@ i֣8P4q/0Yqme ss"W2 YxE3wIHVՒ21:F?'ow0u饖r.2[˭Θ>0Ri됩hyD$FDd"H+< ؕ,iZEd* m&δ!s e![G,lr++b͗:q!Ei-\j%肤sOpѕ@z1YOXTpQ$祚ޯw冑FcnRjxVKGo8~ {sioF}⢞8C@ g7R+Dp&t_&S4UlQ,9A*ƒwDܔTӤqixƯoavex̠4e}b=G=yL?n Df%f}38^ߜ~Tqږ4#$#> [*4s&Moi#?Ѩ6.޽6ե{e'~1ӫw0 H\/a/}oW#ht߲siEѮIƚ\_uS5 լN_HicfD=Xz뢣'wu #h͵.TkS[]m:R*4}2Ir9Ao4^j*_B抦>~,??O͇w/?}O+Ο) &{ <~޾j۪[TFlQm-ղ&7H}uFyfrɝ)Ϸ_P.ge_n$_WaWHlR~H +fRT, q!/%4>O&go:cٳyN}kz}tMH¸nO=ʋڜ1/v"9Lr.Ъ2Z[8̱;7KUuNLVNfV nUeX &DBv(79ȳ,o O.)8CM 0'!U]`lUX `iܫ`]TN[ @augF(ɣ.E-y-̫Ѡ17N5ȅ njc [}d1HȄ)Hz{LA:M9MBx2452b^A& 'wl|匚+NBł7L+Y5 HBbNjLv|qdIeStYFЮ{^Rʝ;s:&hqB(/TkfP{sJD+RQ“WDv::6N-#e~߶-ο5lk\ ܁.wEU Jw? 
%4"̄Snq*xH#@ƃ}->-w6E,TeŘx@ܔLُ?8(ُ s賹`xiYf4isʹD7D"g "+A~"&2UUzii+X\Sc4SgqtWl 9kBTVƺg2'EAuF^e~SjM}d578_m$1oMS)aMASnX>_q@k-iwA'C*O8jv窡P{s(1>9j>eUߗ+n͈`)*j5q.#MoJ$[4eBrc*|um+Sw*SʗL&KIM9HR%p9Dk0l\FK6^Kv`f)RǛ䅻{5S$aI.ad m4ò:S ('.\lLbW1n+ǤZ >41}B3к tWTxOm/Kf6VzNN+F]`^9U:Zcr¾{g%epH1Z{v[N Ow3瀺_'4#P q 2ײ0t3\,r{omՆQ<[>d})#3]:lOFK=ACjQ-VfB{P!ecTh G"BlXg"*8bU1s6JNtmvUu"\4ZQ.Z@J$ X2rJXWREO C;@ V(&kKrNA`#cP'.(R d!9:_q )^lߙ-MPI)(#h (넄)䙒MVk7} x=u\❌@b6XbÔ??fj3hm U9ZQ(RDNC[݀j H&h Ȁqj~~2˯=~au:~:=̬j_]W[&09A~ 0VbXzRaq筼ІE.qF)BWMjmr?7ލ,n9ܙ #6{t?3gZ]u,Z_=[VW\_.|~ 7:{gֻ?7}ߚ*Fpy~b_Ox>q yfܣDyu RHh&mHvȾ=6i6/kzf{9ҠxK y \s0=,|k-!:ՏXsj?SSX'T<tN/:{CpQTF~KBFEk/&2WF4pm&p;.-_㹜Agޖ˳ki׆2"2[nχm~ll׎J*bb7=ڶ|n{VA @l'LLƴ$`fY/ D؁. o;|cE P4 ;C5pþĺ+cx+$$R#BU I!C.ӵjf4:y[ĵp/=;Y문Ãԯ6M~)qǖ=v}xpTݧv{ZrNq:lh-=XBZ51#q'^Ɲrf[w~罾{#WRE+բD]TDrVKns쭦XTCH'&c}U<$v\8ZSxf`x愠2%ДN@i|/Uܓ]<ҙ*Ŀ7*#΍g\> =-~ߊuyeң'TxP6! 7ٟYA%ziNn6O#$;-%Vji~.^AysæW1ew}>;)tQw_xo{M_oFp #&]ShJdX5i֪MּL޲c M S",pɓQhtbb,T1ߺqUcVIe~7CP" Ek u?8.-Tft6}^t/tW5b2ު Ljb=._]loXQF AV`e&nV{%,ş=ޅ$giQׄo1[꫻+ǻg!ſ5mrM&_ɹC'" i3~–T);Ƅc }1Z\}Q~Pl<}䝷쮥 `*3T10abED5Tz|z/=<2(x7:O3^@E zEʊ~ 8l6=Y =/xp55iAcԖgglN08$xV,/8Zg+%D(ނӴ R03x]Z|I5 f}nӴF=#印D8 a^! :}2MF!BJL[^ .p/c3ۛU>^_&Y*v434E ar4$̄T(ض^R$Dz꾀|@.ݟ{uXOrc=ae]rM*myRJKtNDNfU!Y6` 벷rHiZzÒ6r)-1[МaD(f{Aɲ99': =FۮH$<yʁNOH)C$ Н$FmYJm RB:w49}Fpojȅq^׻"ȰRRLev1ehN  F&,(zC@xM|eƒNB]8,fDD:y U)Š6:;-KevʍՐ>kL,Ju$ "Udpo3r1(H" FsgC/@I3dà8pOǁBwi1R/چRFMd"SBdqWB 5lc&FTY-kXC:"9[)w6% Gn jx{βW)~FHiC|z5-}3dT7g^62h/qңn5q%IR:ϰ3H ~**HN`z*`6&W7&/_`_/Kiٽ27n7-hxr&C \҉`UɌŗyV ? ċa g8}f nm{{RMmOZݘ„ɹf]ڒ+ <0fYM`!),Z9`:'9'~-Ye(x'цLR筡>˳>-n.]̋ZQ4mW׳׫]0T^p|üoەÑb<$>~߲s[`kk.UWcjV'˻IƓ&,O=st1=8XkkeZ\W뺶rRHTN4;-V,aur՘e czDyaTexqF`: o޾:?=ͻ3_{8:n ᧭;UOZ궪Us#Pm'ԫeM^Ss3ʕ2; -Z@_8~3jaYh5A#t=E\|N47MPNʫ9*;S NbAuA[s[~P?-M4ґhg{#q}Q8j\ւ`?A-s"w)[ b%69P|JAGzÆkXg:fHP-mR[\pF<1i-tZTĆq)i7q"2RoC-V f;{119X `.z`p苃R `a}]!`{CWVBm_ uB+.](yS$%Sք¥tt-{(9]djRh{DWx@tp ]!ZeNWR끮T8jXU/thu QZ:F3#BsS^բ/th:]!J+:@!{DWX}+DkM PJkWӊj.iq;M=xbQm,-DA'IOϯ_Wټh<#KJK N;j.rۛRrt=^L8oKÜưG0AV,U/qZ.d:2 RO]][ˈ ^&oѲUc87XyZ4r?* 4 |? 
N}1 ::kҥPhae_'/ Pe18= B jB9Ņoo^QK/3K1dL8 ˕]u-yŹe02QKLRWz- d>u2ۍd9+LJι/.Bo()&e2}a]e&wRZ0#BF$CR FSHWZtN599e]+Db1Е\>Y! `C{CWא\!Zk=DC+GtU t(Q]}1tE8*;z߇\gjvע:jҊnm@WtMŅ]!`{CWWBv%'] ]1c=+ ]!\BWuBLtutHݧ+lCW(v_GPJ5ҕR#B? µ/th ]+Dٵ$ʮ: 낐!O8L*!5nwKvq}SpfW0-9 ?mhi_jkթKgDW윟 ])03Tt J~NfЕµ4Rb;]FrdG{sP ?qՉa>TM@z͞-/#?LDvryvٜFep 2,ys(C{osA7B]fDWf>2 Wmv%5U9؆mWtplJцYLztuvNϼRa6m}+UΈƮgF{?v(jt%Olzyk|`'lXj;‹wCR;9mrvr-Jtܦ`ѕq6tp_ВNW^!]`m3G])ܗZvNWrߖ(3+vgCW gc콺R.ҕ@gDWl ͆.Е}iۡ:++]^N+ؽ= 񠬭 XrO 7\JѦt#$a5UӜ ]ᦹЕut(^#]JBP/vhޛAt&UV(4+ 4)銖~ҔtrOlz$k˛})ZBj;q-MOIЌJG;Rͅ;])}N@W߅l!'kfCW y.th_jv(p]}┒ѕ~\J1$NWʁ^!]I aΓ p|ַCۡ7RfUz6+7>^4`G5Gp驖Ltcdm"ꇛ ޿ts?g?^h6u}Z]-+?UIg.Y7~˧˚WBԋ7Ec wOW%esagm.a+F7|r oY"דOO6q_ꃿ0_t-Y7e~n:⛨=r!OۖZ7o~C_ɪw܋WoMe?>-,@E Dre~FL o|F|^j63#y  qrs?̟uUʿK W}wz$o%l%X3uqt[vl!_tTߔN=U|̂?])lp7+N^!O7(jy89o/p%{;2Uc ߒ8Jn 6c/G8rRtkg}~l%$[rMK%re66 fkc;6L>v*eґze?/4(`_[Їi6Qj! X;ymP(䤷2gAI"m=)J\ZEc$dQ (9qNPD˶\gg!bi j+} ()׃$Ir7f>b{n :3qX 9\ uå1Z*F-9Z7@x$ ,I~"S hNCQ@=hÍA2ˈ&fK 4Ơ?ȏ@>R;qnc0j!K(wՠ}hxh( uq$y&i4"bdctTOd %SR$UCܽOOIR0 ~5H%5SCk)=Hn$*1oX;K>@{"Bdʝ%Ǟ\1##[$=3dT&XKgaw Rjz&@i,R!JX][Y_Zh4DxyrŴ#UԡX3&diJ=7gcPG$X i͒aB,AN# B|YW(P!OՕJI+ <=J: oO#4>8 A,7XuU Q兩pl5 '_Xtĵd)4<ں>K+5J=V+;L56f=ak{5h[5B-%9Rq#Z .x X56Jݑ@&6E0W؄&+.G1껡L O@ez7Hᱜ6#L%w蹧%ٕ/R +Zq2M!lEP !,CHFPѮiB~"ZS X-ƒ> ;`&S1JPk':s 4X[y(fi~^Jo[Z⌁D]=u!^d5+& 2 }֑: q*4Xvkw Rq&NRc5 v@T"Zzm>P 1z^$lHHԯ>0!5P v`3d>! 
Db dc9WVcG=D &O < zkt7h7X#fCCc̠*SUڡ;_r0g=bi6ժ_.tXs/Nt,z7`o=?8u#D1Г!ƻAy`n*dH}136u 2j i<#y0 /)XX肸^`z+R|$Q*L&rZd^[`K,VzGhK`%v:a^b )3G[ZHw 3xhC`N,TGַßh{B9¶MZWANCO 1xwӿ_.OOu_zkJ<&BZ&A۩퐄p"` p_82i`F ֛.2 6'[H')1TDyCax( kUx,R31t \K` TATژbkc2=rǀ຅6+zn12 iփ*HLJF]@Y,d+Z=K4] y~cPqac:|k΁ >wa*C6(5< @P֚6k̀|122[}.dO 4%鰀h%#p{Z7XO(8qKRZG7uڂn-u3i4 6iKepU.I!!c-ٱeI=-"8zccT/󱼛[Tܰ "7 $"Rhx73ٟt;޺Eȶm#n'#VXkOaS~z:78`w%4$OOuFO|X.<.mI竓緛6@&zmnq|˫a:~lxkE uZ7dNӫ/Ořt|xvsf:OOPuYWG9_5t +NWU8$V gOw,ܝo[ß3KjYM @(%]_!$:47?-st\2_}ⷹ1qu˾#Nx{|yuݾ}WnV6\iz· HN#oh_ݗywS>1D6III{эE[{,cRʈɎoa}pU|PeU$}U*.{g8{tǎޒ>c̩3Ⱥ;VVVVVVV{ƭd KU{o]U}$*M&N&u߷{%K)́q,Ԡ%5(AI JjPRԠ%5(AI JjPRԠ%5(AI JjPRԠ%5(AI JjPRԠ%5(AI JjPR~jPAWJ'Q@\#Q@ZXJIu@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJU1rIJ p/G pU9B^ sNJSTa$)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@'x꒔@0%*{%>^"%)*-"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ Rz4Lً b\N;Wy8]NT e}.!JcHt ¥[]oWuJWb r&K+D{D;Uӡ+p&Ђ rV,r+@+rsi)׺\ aK+c)w8̞]o0ؓ#GW5GCi{FW2DWz. s+ K)tINWR;S+y +1] ]!\/K+@+OWR*Ktut%\DWرb zW ]Zm Q Otut$FDWؗCW-b 5\ΞDWCWsl\>ד&Lomj04PǠ *\uv&կ&bc2Sq CU8`T/o? nm\2{~g|Zgn|6Ԋ6,9M lUJ娴^5ͫUl'X^ JƋ$Wݸ#p:'[zU\|t(Q"sK7JJ*U{Sn\& Siy=Qb&m~\7ʻn@}ZϪ571PZ}6MK1]q\ *ww?bYLkU:95җ2I3i,#-(:9;iDWCWVzJz//et(9)ҕS_Ū]-'wBV] ]y\QISؖ]pYBtբ]oϙ̰A? 
qPZ޳@W]5SFDWzQ ]\y)t5ЕиoAt.[0}S]}p_d +=]!JCtutDWb K+D QJFtut85Soc0vMZFq^55PפYG5P)J}:h+/%R*pLڋ|􆳾K8bCH{PoÌW rUL܀pE1ȈV o0QEc%-"`'YDFҔBWV~(!:AFY^]!`닡+U)th?{: Q MtutEdg++IDES+o%EWؔ]!cKCu PJJ+g׻sfaǏMWG~DZ|PZ޳ ]=UN r zK+D+{OWR9+a!.0BNd8 CItut%g BCWWCWsotB&HWq݃]L1tp}1A@kx'R+ 3暃*Ù-XnOM[/<ż FLV`D&[d#/N.Map`|Kj͐Y/uRTIATLL}!Ji"w9S%vB;2t(=;EIDWy&ˡ+\)thMňN*eIOµL}+D):E^y[]A,x|yapU1DkY Q:KtЕ߳9ֲO%ap8.]XPڞѕ?SW kS;`_;} b'tgčUUsRrLu1xsl8Kq1"ggwqvW9@jR%e$Ay @ۚS3RAjϕґ`sko)l4{3Z#׼nP(}s4x8J)aZit *H1wxOijgnz-g֮vWV|I~q-MA` :UJh{(w?`t5ŔP%-A> }Y JNW+ʹ ^-f) jwBF]"]q.y_z LO/; 1EW[,:f8y 筹W\w@gŚUvV!j ny`Vn;iXj5lmW"^nxy˗u %=5̅9J&W\t`;?<ßKch"8,gS7 (o/yLs8|'Pr=ǴX\=r] 󏳩ejulm+?]؞{ݶ8UEtYLe)Ag5glfR&9zT-=5Ù>ybUx痵@z6hJS>㒔ɇg aERs 6Vrc}Nm e>kRr܆ 0A Nhϰ{7.'oŃlat4}`kVor[ƗQmnTbcJՌf;p;BӴ2%O3;S]G*/.mA纜ވ4z}mvRmji*D`MÌm 5c:1J8gIl),Ԋ1͓hutx97 sίU]3=G݁+Kׂs=hM2ot-ErQwEL(<{Z]8iz"mO^Nu_&hMXpe)Ί盫qZپȞ?3L$ ,Ȅ ax͢ɶJh| 5w īY9dm̃$onE-g y[AG{|]cw|||{ Pk1^fL\: sj<4hRndܣ;cU^wtf!ѱwS\]q`"&wT%m$}u r8zN:;mۋq/϶C/o%#@g <L*e0q9fQ@1i] A |9hըIh|U/Xu)b8Pbq5ž3Ç*yLo8cA|i6wS5CU 5[gOz<5,mb]imxO}V'UB.P'$(w_oQgrb>ߟ/0~@o0kLSGdrŞ{ܝzo.pBOY]rl)M_6mriyL"w]pjJd⡮sGBDqyD >OTUUV=UV*?-18`q, !2 f0fmd.b-rH*F^C;]-Zs!ujX:lϊ2 Q: [s.fdvB`C|o|{*W{B- ou}'=^ۚ=A(*8H!eNC>Z?Uݬz\h=p< <.覒!J1McPGxu`u*֟QX: &$ggiP)FGVP )Yu3-mJuJFg 䟚 MXs5@j/k<u')Oa' }Y]є?>%Q1pOQF9ajk`''Щ&1`bdfhe#"G3];uԤ֛ x@17[jAR sV;vL fb<ԳJ)NRtR",f#vg`vc{<&%||:ߘ݅qIPV~8Nu,TIȥjbWPw1eŝ0J!di2b5 ɒ ]t6ԩq9ڦp#k||f<̌*qagX.ąEqmUd9= 8?_]Px4_|m1f ٻ6r%W|wN( ]`d;Gx[ldɶ)[4fj]f!#Fe# =ԢugTJ׍Vfa(dGdqٔڠ!X.:jb0.PCfǡXm[?%A 爑&ZHyhΐ pd&&\pl6U!2i e2dkbbE5GQ1ȐHu2vH5q6Öԏ.x|S1b͏]-luo{6<Ȕ&!2Oڭ|d8s^YJ#sP":';G. 
1F&D/#}&BeXMQW.NNi,.vvkFrgEmI$CbABsRdHE5iR 2N\kv==yF&KFnsԍv ~|"G+={?3]# 6DyxpӑNkۻ95}ṕ*`QiY~ZFs"$A SIj)m.}R.nɟ`O}-Vٯ\+d'MӜw]z=7$\0ʐZ7Ժ}Һ^= py͉=׎f!llon ﭞo+E -χh٣y9cȒzvo+5#O&Q|jtۄ\wwKm6I>щOB[s'(Up˧c2W<ϗKU#eL08d WδJh˦hODtup$]IAd@c䌸3{infNG !Ƞ]I@u.r[ <B&3B]Aŀj$,E-síZl&]>$[x.p=6L71aj esOcbz4xU3˄Y2a~`+}%ݿOC< Aݪx3~+s',@i!emg  ӋLF zw>U'SC}RzDfr7іt1ey%JJ+679yZ!:ى r) sHQDCocy|%yg9[mOacs l~,qcG87.h_{«ޯM޿>Oyi(飿;_=L`./#R_KҝqzJG_I=fPaR3L3I BB2ĭϒ8snz7^w>h!+}P־@+It. .m ] χ ӵJ vns0]wxvً]Lݒ;dj| Oyؗ'\j{յ/b<%[_=w*-!o[ۛ %˴m y)|9~JN~!ˍt'ۃ&(FedpU@)VkvR;K>d·nvxn K%I9tBB+/nW"J=pWg3"8VRxaw^\t5lfY][577ӖXo5qzM=,۬﹃"oyc5Tg}z@]\fJsg<#HT^%./dU'`$I+۳e'nnkϞ16hf>&u-?! ϮP^qC\'9ì!j^&*c:e2d. FQlN'ǹvs$x[6.C(,n TY ٬w%1pl[w7ӭ=|"f AҭM:r3~B3=~(lӌ%L ˭ihqDH>ͷvo|˽cSoGYVJԊ(Q`F44t5劐V|N  X.b4\RY!,J04p uP RcZq1cvjlIcgWvb $Ecbo  %kJKʡGDQ-5d'"a}6gQʙQU3VI:Hs4y'-cZgFkQ!lkgx&}}jb@DvLs<b Ȥ7ڰr]E5^Y,ćQk%v> #^vW*3wqh<(^RJIK[+i921 T=JKCi]S417guSw}~ṱߠʾܺ~c687x? i:vGx?WҀƗ&[ċ=sOJx)]Ƈ485 ڕ+1`m wo.^d8:\8O=D-_~5vz:\vdkMڅvɻb#'BZZXHa'4CY]QW*dODq"Xy8&,T-%I}nOgKR& WFk4* ē .BL9sYF qs{Ukݞq/fwH6M [!f,]?x "bA=[Cp.<}GiDcV!Je e鬕nOQSO'{ҽLϝe g5Đ%6' JlQbtH!&j77~gvqteΦ%q?G_R)U*q])(NNh#AGI-^,B7>"ax~!03 :eTÕB-CignVH`mH}UIH4 1n @y(dyBxEMѧ݊b;ʬ%*/$u_?Ձ|q%(X,],|[ 9/AjN>^/EPY-<FMc?nX9B6'/^J;;3op4ή҃_ByŜZɚ亳ŰFׅxFx[݂q𷶚ӏ~E;$~xӐ~BNjqUxs z=8}Q'6$qf[GN|.;ݿuy}69}b׷y-f I)';u_|_l_o;sU> ` Vy w"cgxg͵'x];mS2l?GՄ{H]悬_30BwG{ck$BͽSdCm 6wvkgpx܄,͊ZmJ:6q;z,Kd:M0SF3ݰs/B7lvݰY:4o/bMrx fX4H1 sΒ)! AK'^bBȱP P?ц{US3O][z40zRE[.Z%GB8D*yM)YO+b_9$,W`RN:M@$KF,:^ڬ]b`,'F!dĺ,ִ86x-K= ZtUOfy">$$lB8 4dh)s^l~n09ٹlgS (N%3"rU6l(tpnH##[GCsqpI  S",_!(r$l 몚z#$%rap/{,?2G" "rfd+4EϔLaq+KB\WKTectpl\쾕5P&2Yˬ1TSψr߻.Gpͤ>lriZTK{leOB.jU7lu7fqO9V4cWӉ]&kܱWF:^WV:F~v,apiabX9ǽF [ר=KXoP.B_ο~Ww_}2sޜ8]V$xKu͛7Z궺]s#6Z6GLjW{}u\)0;r+mo\Vo`9`7r {۷QkR;-, EE"IVT9ɀϪC93\NkWK \ʵ_F ~b +ƭ[+[+PiέZFbWH5W@P er @\hɈ/ W">~aqgztpU?6KOF\׶" \!b 4W)kcP}0p>z\!;\UT^ \qfA)k\!B \!J;\! 
Y9:z9p%8U\!plWHևWHW^ \IJn&~NZb7P=9*:AZ ciRy;חzJ#9iˋA]20!3 ,`8xSm"TX IQNFq}}\ @9v+F#35|jf{iv=Z܏/߾}}$sF:Z&*rI)J/-3NfU nwX3lmԒaU+'Z2ÜL 'w((h"Mߣea`$;ZͶ) &=sK*j.$a2*TX.YqŶv&xYurOe:& "O%?%◟߼*FIhFh#KJSJO9Vfoo?npH'$X\y0RkRovׇex` \V \)nW@%e%•DC+ X2v0prr(p pTJ +#_xV5Xߠ)S\%ܬ>”Ysu_%L)嵊RPo=*JJ:m|r -2.ДϫzM}.Zbq*3Z!6!Xt}0FR)Ol.Fr9%,ņ&&E>$-=-4ʭ |7^&ƝU+Y\)@tzus o{0ۯ޺;N .̪+|~L3^M `$$WC8ZoʅЂNxASR4wO'@j]5o=e*`)>RJèNƨ}]kp5vdjlEKֆ(&bbbI3W )dq"kɼcs[=Sr ! 1PRb)d罥B$2Lye,wVz=S-j }Fe/bn,kX20L9 apdr\:i$Rd)7j%%!ٸv%Ae*Op@!l7 >zm ךvܶyQŨJ6]7pU}>ٲbTF䦽gLF*(Ahl'=9&{p<P赊Wv$d-XO.$U\Đ%TRB js3SȨ1fE:8L̜2L0Rl=.Ep:EEwJ2T9kMVi [bpFmrrM3&ɯ(]?'r:cHl@f׌I,yBRG=<Z+<@ZHp%n4 N\ [,vIh RxҀ51Xb&(1H(I;2FxX)XWpԋ q_-"Qu!*a*N!- gF~ި)N!@VHo(-@7NRpj)Z8F$<cBs*2"FÈ2R{q0>:[%iu.Jn6g$*8ljP9I`@p)pq_ݱ/xe%'WƷI|KpswFQT +#+GB,x?N}wusğss|n|+8HLzfP, 7.x>̧LW@x~29w 9v\%T ZwZ%-!9,Ί7@^$qq3Gaz꣬>L} 4닑C$\.+ !ԐLfʢL6b";Q|hWpΊfZKWA1b~ x rDr8f\UZE_+9 Ϗ gQt:&쀬b!X&z(';zQMMf:Pjľy%\ЭUE寮RU9 ~A>vrU`W Uֳtyyy+9lt{V9kI&H]yPZ4oȨU Ot5~|^ 'd+IQyh>8o8 ~ 㹫.^/FGJHGTccq$B!cwuW0Hn . voa&$ ߯z8HYCQTSa[4gjf^LSv `ZbHx*]l&1T97 7 |v->V}ݛUf]hf_??g\56GET 5ʌ) 0>&EV '[:D:w)z-pϫ,G}gj7zDt>0's08ZCVl99Z1Zreά0J5COQ*z5͖O#AL;p;طځ< K A,AlrKVIZu}lDNAsc ^&-tCBy@8:4*yJ~sOA$FMB2CI]Lu'8AխFA!\i噽2Id}22f9<^Gv5jPilyAcobV]֗L")1i-ek_G6~}.%2- '*Dd"K*l D7͒4=s`9Iqj\'GXE={h]=tک>$ PjI2JYJ)Q=dQ֭_XvBse@9-7Ik˄ gYd6J$UgG=kgfR[&*~]E\!! 
4F1as ȄhQ4"uɘ D}PsN?!rΒSR1If=) ǘ.0π%*McCp>xV{}"%`cVmQ~w>{Z9[Ø"hGV\o]c8XT{VnOw{A.44A{,-J *k'QL$GtAlD]liPm|PYS] @g"*eED\ܵdh!9 1A QO=V` U!j-24iR@)8f@,`G 6B^=dR{?G|ۚ#M|oD ؜ƹFgM7bcĦw't'(BovPw^̥4m핽jӲcyy sGcM dFN?2Q1B1`ȥc"mbf,o=@oaz1AiTVjIL-$z9m~rkWWx7|L?Mz0VM6ݢi\b &Zt2u) #A*"C8/d2,Mnǿ?)r#tp &t>ЯP+J4=XeD8n㥼rۿ| %1LN4^p(`\O8x2B\^wv"Nd!J#Q dh0Jq B."2B`uĥdSf"g֢ۨR\I]fD8#,q+n~Svr,{Xvz9i8_+ Y41!诔U2{2rư9rVV[Ey;tHQw7m@O[" cPJ@oߠzo|hDrtHEn /߼XT(C޼>vQG>Ng')?,W|?VJMmtNzJ UF>߆8+ϼxG/}e> k;t& nR]Z:_z,,75mX6 ZOoiG6\^`GG^dl4bIrGGӵ@v/G4"Avyl//: =noXzueͦ_CFK >HXeF()Va7p <@[L*HDX rrDK8*ODUZM9XAGFG G"f_v{ -'Y4],Ƭه䨟>~Rȶxb; :BLV3 D?9z#%|~P+LIl㥷M{Be^'3瑟;nN[z 9;X֤ dND+齍t OFZ 9;q {g7zn6sv֯=bc&+ Y&ddo*:}GqDb> xTATSMpfa @[rho?el:ٓlQA-W0tInj[,bAZɀ"$TIH?E-Cl1W~ե~O9dzhiޮơO~JX cfJ(놜DZN h=GY `b\> 6! 3 )&J-A(HIl 1@k"}ev~H<0f @@eor9'$Nv(=}_N*Fl?M#m (}nB &ԝ@Qmfwo!+F(p74iΟJd*툄@*  \U`(3`nQ=mFܿCiہn5UJK!PUM/H]fK Ivh~@)v /fW]X [O?JN5" Jv|9b_%.#u~v8lzSvף]^4.x婌KOx77ȧb.2o.ҍ_}BU:md-tr8bR[ VB ER,4|^7%ZZq6{74}$nrÄ~HNjYUģeIT;W!FDXؗ6PWN/94:=oӳ2-μ;=aMH.{umnmqݰrY>5ۃcn7v5SY`In;o["="=ޅ4gzQw2qZ6w}%nϽM37?q:=$#BP+_>psZbOŘ悬2`N,B8#sH1еѷM淜sst9&)LNK/ewԍ}zr2wSei/(z>oВwlOqS׻5f_zx}٥°kF52iAVdglNqDtFsb㥎|~dB U-04YD:JFuIxtI{=%/%K$b.Q~h!J/ JdF 9fKQ!%MփWC3jxI}Ǘe, j̼@E!!&\ "<8P q ^k|1't(^UuU\QS󖁻np8 %:wvi@vs­8$`A]eh"h ags|ξfXܞ;*3fS=( +8' ,N$#\"rD=7HQb2僐`p΢@M76wd ˤbjZά_Zi≮%v(AEtg19+Y\&0 QB_WrVMa}6UZd`uCl}r.qE=381-S` {?G|VmNR {]{oG*،ԏcg ,pYC5E*$XYw$!#ih]S8)?Úq'q|ؐ|dcS @u$_UQ4WVjp^P^)Rg-Gz.'0o[2\T_ޙ^*1nB+w4Y(6ZeTi $ I҈ݨψ sa=76p*)cPpHQ9}<59}FQMWRw<Y;^Pm}F35wjtMS}JQ,L;gy1'???G?bljhՉ GGuHWnGJ=x1]G0NC?!Fq?r>[%Ӧ7mkOefg  \zy?U~QEOU55{Zv;`\)PU9W`ѓ'k] (E%7ZOHt7Bq zk\?-"q0{P~mZaq'uqi+ZTi L..O?*7hKZᐳNM 3F+'q[ō9M[u|V.'o/&^-+fwB='~ R^[nk)W*|qqܿ+Gv$GX7 [?kYfw1 +|օV.>g =^ٻM6Lclu6u+GVZ:·oq&#y`Ñ]-Td8Ǵ֧sNmT"OZwԏ ߻o(DR)QZH R$Juc3 i,(AyhO N<@FK@ *V}XYQs{|wuQl9O (DqoWU+ $>Mh-l 65Fvof^[&@"Xs7_!ZAB\kP-Mh? 
hEgs7UbT%hOvJ4Q`ߗuUS8kl\2TG="Hb:(O( `:(Qwg9c<6RBʴ&BOkB2N AR-lBj#iU8cW,)s\ed5IÜ1mO9iՋ_7(m?Or*y u|tzS)IR,jpm톃db㎶P`7G*aƠE'X4B 0V^"h$iA-d )JZ<*Ě,2%*1W 'օ$hTGemx92sQ/ mAcWD #Cč8&PREE32*Ju% ;D`sEFhJTqAy<(`) k2JxK@q(x|#g;"qU;q96:]qQ77q<09|Kv~ߚs֜MWw;+ŸhN?uWa:Oyڌ4iOT^>r$\ޖ4$mHC~3`\6˧=#WK1C*$ _pHQD)öitU?UTU)xm` r1W`J`+!v:<[oIi2̩`qNhcf 4(, ^Yƌw꾪Ύ$8! `U<2:#:fCVrJZ{a)}xO/gKv?׽j,l@zAn SZ0:*1^}jZsowz0lݥ[[Q֓->UEkO'I:ux;fUjuyxze?<_js^DP\̘crYDmr*yrF$0)ef{}\M{Ź|A{{7t`9rjZ &w zHPgu)5q\RڱAh;hR"f=8+ >E"mp&XǛ-3_G3໠9ľ .0Z]'mY}LҮc#r ʘkEnx2iWY9$84_8t-gUoM{œE!Igm)Q9g$0 eR@5,E cPu3:(5" U$!P_\@{ڙ١#PU4ޮESq7*ڙ׾ZZ\Bcy!L)v˰*ԣv|)NKɉ {ɴVR_ 䜔VfYfBx,'U<[Msh0VA(>iO{!$.v*ɂ'ZRRVR/`(s,;2`IyЌg䖛eB4L6)iO9+*Y5qvԳV~{jl&eХ]TbypNl6LhE#R[Ø@dix \S",91:O*dܳipIBp XrO?6 Ow0k1\t;=\aLhGv\o]c8XjT[{V&C.49iYچ,TN`AYG5Hx5ssE9/m!.f,tGSWA|jDTʊ 3Wܵdhisy4)GԣzQꩶB^Y;1Z*z3iHYd?2kn@PlDdw.JhuėЛpϊ(ticQ1HtFe >I[Y|Bףo$v{s{gӤc7]/ <Ӹr5[-5صxP=0i6zu+ A*AkOok3dÔi09x]qM?%+h{ёЃ"QdZ!J#Q divh0Jq BFxdg|,@c$KD29 ]-(WqS g,&2ro9˞*dfk"dI 2VFo 3o`dYJZ*=wZ[r6Vj@k\Ò`YY=(oK𦻛6>MXrGw0ܡ;k(7 PKpB{D on|P<-- McU hVXa<;x8Cs##ģh963 aA0t :,xBRI?Z9U4wK$:X'Fє'*ڹߜN/;!nv{ ۍAic q'1k1AߧO<@}lkåpI$GeҟJc>`kjp 7Y,ցK)$g>&pC\%)U6ש8wkFT}ձg_C1mn6~Y>gE,?i1{9{X@,vePݫE2ݫo>+=ٗ,))ĕR==DZdV%AR WѐtNzJ UF>Z=k7k\wҫ W_Edl&>6ұM ]3,&y5-_-/t슳BakMVvZvhqݑ ,h^s~1̗&7?|7Z tGm ~U#Z>lKQ+*˳e{|w.D w1ct֩FC%vEH6~ ܹCpM阈7r?R?Ƨ~{~Ürv4zew-=b{87Sx$PJμ0jd2y|ݐ9Z A͹ ,A,BY׷`^܃ax~39d1Z7:h))Âʹ3`|0FH`muiU4'>l\O:Z9!pC]rPW4bilh]qO qآ '}1e- M"0e2cx8^3Xfy)d"rv,8-ad ;4 cy \wa\--dzz#43@N{Q[F. DM2%+! %L,YSO(uhѣ(ٝe!d%TÈmn2ǫDIҕGtxaW""L;D+TFraT8trΠcߧEts iOFT5m)O*嫯0R QH]f_ O$}?[m4hu _[~\qt›e+VÖw~~9;}"(SOWg%?t|v[qGH\Ŕ_{׿ΐƻʸP(oO_=&\d߯.ҍ_}BUzmsc:6ֲ 0\$5ZO×o|UJo._o$҈vJ /fvljGKId;3TbO"~R-QWe؋y쬬q龺5]ʻ׼ klBw9߃r]xݰ_95꬏OlMv>`ߏf 5)>}䵲Vc ?Y|s.|l9Gݕ[.-6>2v-4sӳI""OZʡ47GH=2-+v]i. ,S-3'twr`:sE]}id~99ܝ.֊bEnE Rl٤gQw/Uم;NFcn۝l21ţ/[. 
ک]Y5Cf߀z-fƳK%AȤNW3R[9%y̓'5„/u## G?9|fDcL(cT :̴W+;eF%.$JWfQ4j6lԼTQC)WąGol*n/zQ*ӷICi~iVc%Hwal&H6[)LBҴ)H GZY"d}p(iq'M%M5cJ^KI\0rQ\ziPzd%;.h%u6`e1\ )qmmR_xTe]߬uz_lD<34bQ#ihI+CK΃B`88X 5rl:Tp(S-D+Ž7 i@­8$`A]2`49h aX?|[ƎşSBg~k'%8G!Ȋ#b\ %B!V'3x1ju]L|RYiPxczGL*&.ϓW氈i5qvt1W@igW0bES򟧑f~>7;N%ʎsrhu]ܷu)hxQ&L -|좏&'79ӼHo 품,(/c RQ0)Ed!Dc :lTe&}}$ rLBEm:A%f3:F|V>2ҬZijP( ycrY_9̸% 4P *cD)/sVL`0:wmHåmi d7ݙLfn[Zrf2Wzb[eV3jnv!Y|XU, JD5+ Ma}>Ugͮ!6vb3tE77׺ 31'ǜB?Q˵/4Ÿ(vZvq 8omZjZ'V DŮHRE"pXxs Jtskقv~Ђ%I }U=>J0&b O(N ;$hU7k]n!8dZwiQ|tA)hBD1$K.&/lѮnc`>8 " D]֒MГǙE&k7Kt6fbF3g'ր/6٠9KP:^X.O6l/ԸȻ"thdn%Ri=Z$m֡0#ǎ5Z:_6:n8 o!!/wֻ?m,)p 6 Ea{%k Y*Fweͯ@;0GEv-*҂.;ǝ)$BdqJL' J0#x2 wvAy:@X68tQma83yLQHoJ9j>gyv䚡mɚ'Yd|sq@;]?ZW-S6'[}87 ?04~&_Ђ &S$|oYDGr vM]Mކ|޵i;_/cO[{_[qp[iSlmuxzvnÅ3Z-?wJY.hHaX"ljw8|k[si^>)<`\kuQ4U&ة`oQ9}7[v92G_>G[Vi=$ڜ=u8XͪʴhxvEEO )r\\뺲 \'΋>B+/"-1`:Wør$}3J'pvƃ&ZFMp] - Z4篸q .Ѣg_d#v⥠؉OGjHPA@Z+ǔ Ueb$mx"'AgOi\h Ά5m/V9D%h])pS;.c`sם&!POH9tjKD[m8m3u*4ܢVa; R̬݇\Y@)ƫɝn,+qo:_S} ΂B4G A^20UE& de01Q]G@vRd%/Ef 'hpV(230+*+};eoa^Nih;SäҋBE0."hs!V,唙t2X~K7vvS{JwzJ2\8G(Br  pJrY*6DE1jVe9v2:h]UV13 GIrBNأ)sE%0~nG{LW}O7B]hj1 )uL !9sL3H}C}{A'׺h[K/Π'wve7Tof> ._s;kŀx~>{2QgO~+/eVyx?N25LGv+`rO[V(dyE@d v@ܱ߭̓Bsbxy1dr#޻Γ~[v6;6e<]⽥>7a9]dRv&o0{ΏoEZh2#`V.gQioiq4 ܎qG%!U5~ǽ,BTDWXjtujQ8VJVDW{骅j+D+X+B ҕH]] 2)+ j+Dt(؞*BZBFVCW>4lOWҭX=]])`M9>Q ߖ*яPf+Ύf -(98 x2h1|y6'+mrBrԉ'L ј?\ ehs)F_n''Tc\UM5#+%w8vBW5fЕvqV ]ZNWzAҕqFsU]`+x-th  1*+T=ASWZкG!8W^1Hw{?Bj# -) a3c!b=]ݷ+<+AVCWQJu"JtutŁ6wfp]5NWH)k2 uP ]ILv%$h+yEtEhWVtwE(e]"])r^+yǦGLR77lm ny }- %1(7+}JTDW`UtE(5tutŅX`LBWV@Pv-G+!V"Bt5tEpy5 tE({ JJ`++U5tEp-BWLtutE[쯬--;.i Z9e#+u莍$֮AVtf/ (+fdWWА8`D#k v)U%GƒS%Œ#y佟-9MK&j#UCWh00tE(Wp(9 [ ]Z-NWkt(te5㬦tle5tp+ ""C+UM\+k+DY}WRwlzuĴ|mxFp;]훡{Pʎ-SЕM)g++u+DzrOWCWׁɛ M5tpժt(j2 w NWRtut%-8*+i9P ]\j+Bk:OW0{I;.:iKVNx)hKĀO?B 뗗lPn Iq"\]W:ހ(z%+Ch:]!J!ҕq\9^]!`q"\Q=מԽvutEӂЊkWRs^HWNYjҮ"]+DtytumQ>G R,S*12lx,Z;#v<'_BAei>C ,RfME~-fUY{ |~qSNU!զpL(,˄F To-2I߆3xm^r;aVZƷ5̍.|mu}nx.wS&KJ3P->?q˨|bE^飥ͻe(a@"BtG$I >x\5_H\G#^!gXU4Ni*InRp<yMgsO˞K[lBpeKmE Z! 
Q 5 d{{R|]C{'8]O  5h[^!2d%- +Au^x8C4׽)ǃl,`|plCcMB1 ^&L aIxOlP\`!+~FcJ s܁KƠ羰&9E,X!|-Z,І)zhM>gPn \ [I1Hc13DIKNx6hj-)78>;JNJQ*6E1fdEe JԤT6ҩ gƊs'Qh;dKVؙ PQ,*ju#ȒNI1,Ɨ l-IyפHx[b%,R7fTxUU1Z#M1ZrV>g@LEk3%2ƤnmS1Jt{Іyd,4T2R rȟ|vL4VèiL㘽 oT#FSHU"πQWT/$ՏE!MuG:d %SR(U-рOBr>=7BмA޴\s9 Sf0j$7.(]cpπQw0qM>I5j'-ؓ9fqu4~1`ی U}h3kUT9CHQR$THT8k]6Вj,֓ڮFR-+Ɛwr26z=򰡥VnYB U|n{*H #-Ġ;*T{|]Z};a:ir@,*MVD`5R(- O9@EE; N#4o h;o~lX "(e*9<Ԫ7JX ypl5M}`wuU6`94<2acbt/XY_uavTL@TCJB QCK@T:|\ t&/ճY {S6YRPhq ,s4e`jP VYRPhl NZANc]V{Yq J*H{m>COFAXyǒ`:ABb|Pj]JDfM] p1ľ=(aL! ∽}AB>3gʠ^;XPg%4' ǎ1*Tei58!a%G< `⍥^|W \sG/x bcgG.0QCیZoC{T^*KP d㳒9D%@ۈقjcW +0=GOE. !hd5B5 !XP(ti?X uyt%+&CFPVH܁m C[ WVUS]ߨ^ uT'(7j%^ EP; IƟݟ_n^e~"d*Kq ǧXk3tdD4hcc.u%ǠmXQP}TPGQpA~,jpⶦ3-Ah[R} y@5 bu5,hɭ]Fp2zvhu<ވaD td+% Ϊ$12iU(- A5qwpDip p&*b,z` Aw:+B ىB@`5!]ud15MVoj3TJЎ/5{%E;L퐄M٤$຅> FKP0\qkh}6`O>|l^/eXviϷs=; ͂ ]H77kF֞5`SΪ6YQa6fѲFh ƴ[E @zUF6pݰ(a!/{úSiCݞbDuN M@uW=CE7L _0mQHitrQlFR<A lU(?<66Tc@sກ6[[rJ@:xUTAZQkԦ5t&1jj=ږVϯ{*6LLOfs jX*vpڠ@~ "UXkʛ; Z&`-\iau!~Z)Ahz|dNҫzOQ Fp8%oC)cf @E=Ap*\ίZR- ڝ.!bc-ݱPo#)S,])`ZX-. vuv뉯XT$R R DŽ^˱}؋[ۭۛa D4PJj!S\* s?yXrtvAmK8Z; z}^`wޛ|wy+)n7m~9mҮwtw폛Cz8b9os 36'y}6} gn?|5__lS[;plw,l~8ݱ^lv{-3y`nkbo:H3RF|"r yrGJ@Ps$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@$$$ $I IIH@%S5j{$aK? μp0 tJJ@^@Okr#I I^I BI:?[*}O_/+sO߿ǫ6 ]AĊEw8YtMcR\I0|}L =>kKgWZer ߹"V <ݳH?/EFimrsb<|ᇻ®aTx̝]8C>|hnb?j:CΜ=Sv m۷\mona/Pw?g49^?wT6δ 4NsѺ(8Str>Dt)iǠMNW@]{: Iy7e=zz LiBWHW18զ)NLCW L, F7?=HW)yDtŀ'4p448t8J7&0/i00*@WAKh&4tpwNW2 ҕU.8]1HЕUů c+FIQT FMDW 8ii Fst()ҕ*į(}?z \Ơr}@7G] ߯.d/]״3Qi?Qdn O?m?z_)Om/1O3lK~`}fi{Vi,4hџDf M M=ئ~bJF߈O'+~NOعyN"3y~ ϧ+F)O^1Jϳ= ]1Zk%? ]'N}| KKah_a(ӑ׎UҩKnb;]1c'BW$f+QW54 ]1ڗگ}G{xNW릡+K4 ]1Z3xtE4 %5 ]1ڗa(S+޿?pO9Ѡq=)Ѡ#$A3f|t!lLH23>7czdf+Wjisdz7FVyGŤ#l2Nɠ36B)g62gF9ZGJhf:πC24HMO1J6Htl7BW~tF*VLtΣ.MCW~N"?{4tp4fh2NWRkS`죍۷ǛHFeqW8b%$%j,1mf=U_UW\. $W*oUfi!ĭ1{^v9RjBfAs=? 
շ_uf8H9݁ghD:6Z7c;W3:ʘtf)`5J0T2Ʉɋ4zJnL!_=kU?gDㆪU8!+n*V8,^NG!E`WPvw4׋e(hF ٶKZPG s;4S>!S]vCKZV $ +8VNy:֔ ^4ku_Vt/:|= }{;p!e/|?@æ&M7=lm|hY?Ո sk1N׿>h"h9号i\)iddf|1 I]nZPn1ێq[t+&Ph12 8ydj#E~y( >}T_pw7Fe,Bjg/&UKs [$)N$Jإ3⩣h|O,OVn22'f'?\m+x;٥kDD2̼l[.{: ,/̾bSc{UkD :C:DMVg a9YO.)J3A$)nO(lg!C)v=Ʋ}s?V+%Uߓksgdw[@˚P-H@FŭG1tZCo]+lI"t\-)ʴk2'iƇ.'Q&W=_ڱZ ^=t2UjWqM׫ٝGIlgRazW_‘2+uk")X@w)xKuǢ9Y= \N7-j/zB0 'FFb4 z6B.k*=dI<\C y*^6ӆ"0p-4%k=Z"LpMଡ+ohM0;@y>a412oЊ3jG('lDk9 2]Ş-f?"JUQDܔU ]q޺bvB"KPL3!|1? !0;qGCn2"U=6WYN=ګ ƭhfۓ^yj]98:8:u8X¨z[VniֺU7ϡ5 QTb:]x 2C*>|R[nsi و޿D[Ln/ B6$ƞ\SS7ls7)ݬ T> Y̫pлOtN cclu>Mn+8בܱDp\5Ll6􌶆uwBXwPnoh.^W͋\Pf.^ˋ7߽o8qQ8f1P9;]y܈=S˚lW&;Vg+gKn@~~|1tK?/Q/v?#Qq;\ٺlŕʄv7 ʅٌ$~EC)Nɮ[Mon{HH3pH\r*1e9,pDI"Zi q'у sWT1f{.rҲ3¯g.[[w;3M^@Xc:fP-mT-w.8Ii"Ǔs/_D"O VM`fs{>RU]y`GFA'x,rm#)'"t%Y0b`a;Aܽ {]}iK8Dp&{O0Gq$!HI+\vs&.\]rҔ'c11Lbr&pR0q^5##S RgV:kPDH@jd>Ǥ\ik1 /j Y$}`1r!D4&.) [A)^ ٓcvﯭk>,֠Ep}0|pP}No FrVhB-'Kv*,%R+?} i\`}!Ij`kQTX0  N )՜S^aJ0ȥW6+ǘkDa#?i)g#1N\GrBns fC)sLq5z@6(b^dTQ"g\A)q<^3[*`ʭxb)s}e>a aٹ8'r洴+BGiozU^oN1uS;/ AP Fߣ^SN.;iSG>0(6Dca"W K@Ts-ƙ )1@ FbT21 Pcpd}pR!"e`u< l-DiJ;V}Li$FpQC*A 0ZA2,%D"I$YU%%!@E5 ڲ||*Ԁd}ɠH AD3.B~(8M{vM[QqqFǒߘӿJѻ:wv Ѵz|ۋLN)~k#ٽ/=,碽ؚKFO2YrfKO>*K4Ǥ c5:VSFRE.&hz7Hu,@ &%:EJFi;{$6UIơPlge M׳TiwKy]Yq /nNj4cs4F8L(C?s)m@I4dg#9qJ{%A3tBߎh=T6ktmpV\cڍIDZXmy'MB2k#1("A7&X4׸H8"R0I;#s!ZȐ\cG&Dx1:[\H2Nu m+>'ꎻ!fx,18"-:YēEܚ"^ФZ=nI83;Ă4B Vx"@>רE@8JE}KθQEE5ƣ{ Lh Ϝo0^c춈]F|bY̔bIɡv4l.6'Tr&ؔ9IgȐZW!EFUQB@NvScIDZs? [̯U^دȽ#rG|g4*q$Se?Z„ُrm 1Ze.]mZ8laU@ *eNi_h#?4G.z,i|#akRAsD?QXP< 4m,pF N-5NiOs<ç Yt#B +HȤ`|.Ė`y҆`Q\B:~d[o(K-<G]\tI̲:9-r TسtcxC \ C-45$LYhxuq~x 7,yk6-b8h-_̆e,̅}[%ת : nRAsJW1gp dl nY;͐{+Ψιl:2!!\JՠXqrV"D~ ݌{f}3NC(7vr-x3dST_z\GtHn8?0Y{n_2r[~)Fݚ3m>emFl]`juQNsP hգ]}6+زhdMƻRAyvy[n/+>2W A^芺n'~ J ќtJMO/. 
W`OwEHV˖,˔-; ö٬SdT)7'S`!ƸP=4֖i-KÔہOGDbM)![&u3B+3p̀%P QvYI[+zqm8{"eӋUpt(aTKWU;%X*k.=JΆr?7+Vcg  r{hjb>0Umf b U.qZ^>%T 8[9r IL4EM-,``+^҃>:n"c1+p7}:=?]LZЩÛ?6R }hf_-Ͽ$;~x PFA€ʐcYF$iEV&n#rB\w{syn n<^ԋXl>TRguSk,䱹ɓ;2f'PsJc d]6Ǜ-G>3y}BG aDn#ZY"9a*9Y7'n 4 nO Iy u s6Ki,GyP)lb)̽"]w?3hJΨm29L%9!H87k#9oӿ8; = \J./$Vgݣ5OT$,Q1ӁdsNZ'Iq3H˖BL"BVf0| ϸރ&(b2H8OByu q$Z(DI/@Te Pu<^!v@NB` ǤAdelS2VkLy'+Y5qvԳV>5kJ" #OA V3r\ ʓ"dL"M>TP`{`k#:&+!YL(!Nzb4^ ΙsZatKx3sv1MQ[f$80RpٱpT{rpVC.n4AQ0GnX;łAgH䯹W@( #1\k;!.f]tGC_m|NvjF \esIr7N \1hR2=G=cGUW;De$haO̦&MjKYcn{ J $J2!aH6ydBgR[>xOOu|KtʧlklFdDCH!6(h ~*@jȢߐzsIz{T;gK/RĖ{eOZ9-tcSNr`ĸYb2 U 5#T\r7vȋV_l=WB7,=E.Ʊ5FSӅFNҤeV*ZHj4Z6urk7=xߣ_<|L?Mz0V}ҍqvl`"~WfýbWjv-TLFvR=Z+HV:󷅍. \X@,v /ڿo_,RvK/~O~_dS˽6B\_~-%p/f~Jq?i5;K(p?)JI/Z4$Y @ճ]s*قo`-$)ATѱM5]`m.ע҃ӱkK Kjƫz3eQ-N;kA-$d_LG?hu1<4,5ثRԳlIr/W d),Rzu6H!L#M $ : `mR C|7i 8p>Yڲ,(%^b.EK`;T1cY"cPխR*)luLZg Y F&]8/nn,1#Ylᦜ7>>lf a}Mdz>"bm[dX{(7~\ξ5gV MA)|9^%>ﳸr/υ}g2 QYZ2n*ip\R; r] dID0)ȣ`΁hʀB+/|f rI=pj#j܎&Φݷe/Yޑ{z9iѕq`^ŇtdN gjsZ63VQ=ݥ6c'[- Q1ݺb6!y e}QNGtЁY) KΦϿX!1q 6NgՀվZɆ褓wC3LG:r)9O{X < q([ I%M (u)w^wk=Y46~ܯfv{ 5-~SӓEtӧpF <%=z :fMĨy#ZijKЈTU*Etrk=G" r~_!T9U7Cn 0נBd*C`>9ԩzKOaώ((gS*D-]R;$1Nœтc\+5Q=;ܳ;bYOZ_H#f9ެ?]C]8PEcWm3?NYaA؀g)Me鬕 @Kj(4RJt ! 3 4%)К`|B- .ZM >B=ٸUq'q-}'%圐|8ف{gj4]RoDt.0Ö́^h9c [$}1e#JM&rLKgY1@o./%O$}7[y7_:/m~ʿ]WNhe .-zyxv/aAڊЩkY%ލ~8/[\yo'GaHlzS7[^_ .xW%O\8"J^YQ[pHz}[9Y6\^LykJ*aHjtMt:/+xYެoGirH-.񓳟'#vˉG˜;ݟ3!FW2\mh'Hw'd._hs/itzfϧgget[LjGޝu/Yf~a(g%k~[_׬'-ؚI x7)ˬKy\w)C6"qǝ2noړ68)&-<2'w=5im pTnK 9X.[z"Qݭl2mwuSvkx`)[V>頮Qvg,nk&vޞDce˅qww;ug.Vb"Dɂa2dɐU:vBf0HLYe0> }T)1!dH@!3sg 1'ƆI*s#K±/ JT)L*&1mPLjku &̌I`5wJxiTD BR ѫb }3OQT66FPRUąGoy3ʦo7iPCm>lB3g†dmRR 46yaz ɠ{C2݆z@]!jK!Xyy18r69Vq.9䨇xϜn;֝Փ:#/pKrx X4Hb2P$r9KC)! 
AKa`ke.PbNȱP3Ph½~ S-CHtǮ "x;,ߛUZoW,4` Xe G8?xn/Ϲ7K<, sBLs84L%R5!VRE˺!& )q!X`@uГ} I(g!zENcyeMvveseγHrJ4ň/f!QwSc:.[)L"J" K!d=͙UD<8KQO1Gcw25 ym6`"f++H5q n_&aPcRrDt{6$_#!88wmlqp+S!+AU 8zPCZ%rZDduӑf(Q#D>XXB,'r7#j1yD5GD4]X2' Xxdt JєҞd4Gb(9YEKKPbUk `ԯM_U;Mt_ L6}%'Ba :Iyh:' ;.^;ħ7`6$o^Կ_hmɫÕ8]8O.A`fl.㏸?|x0h.$R.Q S2(@F#z՛pWGmdp!CL: m^5um frl@vyDc2;=jWiYq^Pi-`k U cM`4Gjd>9Se wE%KP’_O{uJq+{Ѯ>[f(3W!0( ZAw]1LPJgR.9&{Į+qή|oUV]gW JE2z*g$]%\Ѿ-y{rՇ;gî8Ƅ.t0y;%kPFj5!/ƣp> ,(>ou̀(00IE`L:丗 ڼ,O *-*}:(ƗWqq[QY#x2zpCXG7/I[rC /1.a5v W| K!zW΍; \/s\ =(vklWL S*iT:-+vIt"Pk 6Z-u^\G8E P B @0" H9Op)ݍ`ɖnQVJΗf6=k)VO>S|G4:©c *^)8!**4WJ j#[IB41U"h@Q~flrD{R*:dzYO'4vq/-}=t֩NwB}^ꯤPsHXNu˜<='Oӳn9yzN9yzNú}QHX pAޗ`-V,$2<`)Ck=ZCZC&\!zȵr\!zȵr| is1#'Os<='OӳIK#$- j< s) R[Ƣc\8D4e,!]Znu '1bePYzRRu{)탬ehI\(wp)B*j|sWL#,8ZI}c# G(Hf}<:L+wEkmUE# L{_/Ά=HtGn=e"ђѸt&HR,m)AE|["am#$F5?h֡⫊#f)%Fh'ۡu O2s6fMi81؛*NjBq}#ph}\s,UK$Mu7wӦXPa* !DyVQc!츉D3'3YO'ɜqIe:)1ihWHvaTDÔR;`#l82{WT"AAro:zF4Q&MPq*O#ﳢdP}MWEчۀPԷf7 Ȟ>x".FYݧ0}*9a,JY}nEZI^Kd1%agߟohi`GR&zpưD08J*Bc3h}"s"'Y OU*Jub6- e]LLf5 ٰOc_,18O|[ ;4_L{-@hI:Yn ql0[~Wc/{m(/p86!pzT|os$ W(N$|nVb0Ϫ"a^^[wx j2ڒ|~-cHm~_۬/o;֬f'. WXAgͻ&ίhJۭw˳?Co\[v;L;|b{NeہomS΀U[)M@r:26zDvޡ٨ϫm79X.62Hn4(`4ӱ@m{kΠ=mjDUuI[kuƝ~-+scA6lj yFI٨WP'Md5KMQ:qY-%X+l6]zzN8=Dp ώ' $&Q2;tDR@a֜*JB9sΥ 'e KiYimݍv}Ɔ9Va=ʅ;ϫv-۸Z25c"$)'L~vI%#ΗXAJY.6C;T5>TE4NZ9%m>jua\Qp+e^&Q:? 
.;M @{1 0 sTڙPvkZ7\Av6A;.a/`S:+PAC"rpNSw`*A5x~l*aUiC(X\i.mNjK+ iщbh`~Seqaj&R_۫ N/W/jD4N`yק8̙&g}sʜEޟ~u7'߿oR.(~0?>&]\o\547*vsa\rǸ--ܚ(=.}~=oUSo~Xx  \]|珴MAUk Tqrf!jG%{sMߨ]&#H[?Z2X(_O"biʴ#FYALH !uG76Y^YSh^qg#6:Gg["#MXKS!M-S2 AJL>LgzÉ]+AMOvFI夓*|*8qYYʶÎ=90e GT2EPv6jyzs n};<,y6A`*F~IIđpZLPa 0+]vuဂ-B{-zCJb #52CB;XNxM,B\jr쨛z1J[K#E:A'z> P4EL3\J >+H9]A2X, @Hy4>D+p3b QYǓUH p_5ꗦ*őp.q&eO@"E+o"XV}1ܘnYD ,P8\ HDLC #%ݚr׆Gz?ixXi] s/.zɬ,YBmYo\}~l%wj[fH{VY|żcgS|g/|87,8r_;y_ί|wm^|E-fMo..'gә}_0oG3qQa7wsŗ#{׼I-#ie݋E Y3;c^_ڡ };;`\Z ؞v\X/ie/kb'k+!R1PXOKQ“;DTn= N-+4n%!jdy_KJSvj^t).eUYL/b6gRwe{m6(VN:N9+gpzf@ B SPPGHDZ>$!'kH`kѢJ&YВT3 o y (iFcJkQIcP@_ `QR=cQ r ]Z5~Ht  kiIGePPAp>O}ms;nvmo*E~:2 N9Uй֯"t_&GvTΛ"lN5[kB6  D/D5aV6D5E QUlL $*oГ̒41DsE2n5 %@0RN 6g_Z/Y0b}p+VzQ<&f!PGx[JEk%"̅ FDsThs2dXZ@S"(T #hEl}8CPnr(IZ>зYg+m9Hbfa1B2uPNieYzp g6FUyk)kA7flig qNnPT|(tULQ'kc@ *":Q:f r<"s³3&1hiW~?Ĥh$$ڔl )) p@! a RdV3/NAL|([jy4Rȉ M.IhhgK0Cfu֡>Edy!|ȧʤeQ0Dz0ϩ l.Rŵt}EXJ{sy.t"4S<0NP!Ir=ia+93Iɱu*$3L;Iw} ya1 AYU dLIj 3Pi6. w9c內z&)')EN݃RRpE8O' ![O,4N%F}HYsׁK8#=twBll K,:ūS&6}0PqdDUD Eض @l%lgQJC F6{B*J-j^Ozz΢kr}[ék';VD%-*ǀ.&5hZ^ X'/`mM_\mS8Ӆjيυ}W*XXL Hx(. !/1D.&1C8ҹgޕJ ]OڥH%ɗrjvR=c3qJ3_L3vf.%vqYC|xy~]VOl@w'ϐ;XLRƘ$9\Ԣ8=NDIZTLf~|'ڳPrz JY(BNWĹ#;ⴛ!u6Ӓ]o|4ڣO,ˆd=[Պ 8A6l 0jl"/ A+8w–UakO/>FnEvϸ2F#-b'*Dyܮ~466g@wjˎ$.XeeYC(sd:%;ҾQt4l|V6vm / @%Yy2, e۔9`-Id%7h@@e>y9W(b6!FrK*ֳ!^ 1q%kn:?lPPVNi0E N'/>z:Nh ;/Dtʆ~:Eз[r%X'ۿF?24Oxdпя=UG=f+N8Ƌُ X3 E_zDtJitY[޺]~-FѦc6ݞ_;ǚluvvR3;n6zy:Ϧt9~VߐGvZ_sEJsUJeZϐ{oXu} OZ^ࣺ%{e/kxJv:@a;"Pt^kbwM8;ϓK=+%uO :]ׯyэWG@R Z$AT E-De7+A(.;L.J"߽XS|%zgrfwvww>ry3\gr`Vi䩨'U]\J}Wt$b}CUeZfY|A2ez\.#z}Xvz^o>k'ּ_adє(h`ۛqS\ _a|󿥼ƥ@ԕ> bqW޶7qAi{OG%?ve;+:^؄/tya}uGZagPI&> n촛AiFv¬1o3N"c{@[mCm,g5 M% e(mYgY.Q|DPKHiee G"H:rZ5l3qd8٬nL]YO=Ǖ\N]s|{oo{va%Z04 Q8":oYS#YJ);%;bΗn?۽/Oa9}4-{ |;ln',<`Q'ӆpjXDJzN EoU+Sĭ$t|Z`(3*92:ڀmT(`/XJBQE.8SZ=qMmX6ŭ)8v,?eI#ݟeFYT=֢ LEd MIAbݢܧa[;״N'uսon8kIP =` DP,+;h`*RGaTt@"ap)AS )qZ)K)µgk仮8[ nz@cWv"J$>7Lv+ؐ+n-ުإgky)|o*XDV0"X4`n*F@0V)4 %CNX B8FSƑrW^)Qu8%!  
͈QKXb+WLGfUg`zDMΛɁZ^tzx@DC5v&xDD݅%s0STҜ JQgd4Gb(9׉#`9j<Lud}u[7Af!6gzgrQ[HlM(L vf'I9j@D%4kQjcL0&hijVwPx4ԂO(KN"vۤv!6^ -|C14mV Y@SbXVcjrKJCKݮrF:-`IqF)n<m($|"rq$*?D!+GEqiٻ"W柣QƑ`_Wyj3ڭQnԷ]n%̳w{ 7&g8t\x-kIm=KVt5Ϥiqۛ;CeҥDse,:uyߋ.NDc`f0%sH[]FSxbeK>/kxWt-JZGK!W1pV{tHJ]Z+IuntU{jVo.o=J}#XSwf_޺(]m˪W>')ݷcئ>c5m hݞgMCΚzZfjdkt%PUŐJ6S7W6Bc,>XĘ L<)eGXJbc>9 x5kmFrHMv9I,R:wmS NJ =As 4 XŒ5=G GH\!131X$pt:h5*!QBC 9#VNY%c.YM}ނD#msx渽Cɻ-څt-;`MW!J=-x@ Cp`Vaf'4% `4 aSsHtxa?ࣟǾj|t>a5 1Ym &ٍL0NNoJm/^Q^mi lzY!?)~;m;?[wH/0O|JW}R\p.2,ݼrLV5nQbo9,x v]/=MV:Y4orfS%ft~|b41' 7E=ݟΉ'? (ZRIef$q> `ϦUaYٮy-Lf#7tP1ٶWkK [AunR,o(\IU7uK( m0½uZ]}l%Y WWƅ9C2yNRf1l|xjzJHI-iUz?PqZw$|8+y8\`+E촨z[a$4#Ha::RLݑNå^Mg/P_%P0 *:mqk@qK MZU@L_þ Ysern*alR 8~+H?z̅NNgW˗JBJIӁБj,̿j;3xI0QJ<-nS=']TTw泋el̀\OQ/fo#hʯ; ,g]֕_e2}[YFw! 3WRFOo\f^uU6ڼd۪mJnV&~hh$-}64u2EbѿV&UdӤpdQup @ǿ;:{o^_ߞz ug޾؁Yp`?_@#+K/eWKzT-"Hɔt:99|pb|>۩';S(yʳFv ۡh ${ ۹q4ю-; a</ dDRKռ,(dW {@W1 ON*+b c0Y6 Յ  u"E ! \ Fj*e:%vZ+n'c3pW,}ak?8Q uZ)"ԁ> |)bd,s~7 z[UAzJ9JY"!h|V*g䣲5Ec:k4Ի2\(XEO&L9D*%dV25 D4H?SB3]a 9;˘&-P3\R8:Tڜ$UL2b"iiD-):@H|5 9 2;}^Vr?4г(_)ʷ:IA=6NfH&6pΌY4 = !4d cdѤeWz&cM\}vT!j5n^Ϧoq8~FԤi4&DE-('+䅜McxQHq/aDZ$\** jT#Ex@lH!g4{{{d7Bwy~7zܮp>Ab_6/I!iKCqd=åy.`_ף:AYA4̩]Yncj)ћ%8G}o80܌7p"cd|Prw:{{;:-]Xk_LěQݲDl=Ю@={gu)) Q*"R// 2WO i,#>aHN绳ni{/~LbHW8~>L`%.R?0iW8HJu(k_"H_H]{LCMmK ]K^oV+8]]yihav:|v,n~b8^=`5P2}g~={N|wEMEA&Pۇ a<,ӛ̀f3ZԫӰY<X[qL_Y?>˨o4H!L#M $ : +/ڤz8Z!Z J1dFk˲ #4K̞3ܡҌ)0E!pONB7KQ8ב-98fMYr;sqm]A[O 63M녍%-]Xb~巋Ǐ͂ LxUl$y`'^hȀ[xΕtֹ,Y't)kc1ųB/dJEEF% ),Wv 7)2%&)E^Jv qYd.s$IPuVgǼ[!K4#~q.%9M6V'è2dTLD@0c90D)5i@v!l 7_1yA !!FVikŴKCطIWmwqO4÷=y'.Ѷߩjj?~GJҦC[ʎa)_MR:p 1dRp+C ]T$:GBȨ1"X`U&f"mMk9W3Tmd&%qjXXmf싅“µDoŒ7-.iYX~/4~4܌Z&J"G# e#Q#(g uRH+3k(ƞIDq lJmpdXI1C`Gyq2wL̮vV=j vo5sd&ZS$Ff8 `bźAYiZŁ,db4- aML,h #a}1fY :_;8Oakԗ2VaLǾ+#Gĭ)aLmR̓v+&54$Z g+ @\ HH}V[id';G. 
``!餹2/#}IY(0.U[M2P#Y*W:͒}qVEb[۹rgEmI$ d:CR1'E @*BIjD~Pkvt,LR;mi]}5>\ r?/TBq5 <4֖_9*[8-#mog;妔lRbE\ W$=6u=/% OH`#9k|CR|p8)pT]v3?8u+TiΓq|C/j1$`2N_?Rt4o~NC:,{@&2?lhr r嚋A_1⣴W "ޗ-KƩm`y 6Roë+mq'VNu@*_?;gH)t|̍`0 ڏ#ҬGY%|8s *tMdAh=hG v׃Q< Ҩ&рdqFA^%'Vs&wMO*6!|(obiW2SJv7dhiG4'n. Ι=rv)~B\ةH\"bi%W^%L\OhkN uOEXU}d\ч\ؓ"<*Ҫ{!)mWuoj4(.дeʴg%29䢸~m]['\轏YaAe ǛY+%{^d4Jӷ =nyniYY nZ+g;q9CZ. Ba!0x5>>۸f-Q~?ݏ[zg|KmtзS?}9Sk6J.v2ܖ٫ۿ^#b]&̿|&ۼ򑷛M<␩,hANԩЂ"6]EJzZ icez8^. ȏScE5@h ^(9  gh nJZ,#&dz)isiCiile=jmQ eA?{ȍaaa`pM;m ~edɱd'![,e=Zi#-dUVYgx+} ɤhE0ȴjņ?39mD?W:&k1F`/?᤿ސfYKY|Û',s~?kJp C}Z9rRP ĄT(R+1@CYCQ #qc#@5P)b%'e&nǙxE^3gX|#iF7 G7ŬE`M<`R&+LoGqs;wJVO%hi6.4gl;Z m s1l4B%UOFc( ASޮIx. >M]18p^CYF"V, mm U̹4="PZ˕4gd8W f7+ٱ51yӛo)^Cds{ +jvH.@0)`R>(rf Po]Rp1uBB(M\: xudڮ>92H9E>嘘7 +X#RMAgM 02DI p&>E+Y,gvመj9a0ܩ[?od"̢E IsYdH13#qVLFjDpISZ9yg*xQV/Yշg4umm+J6,aAs{r7ٴ$&U<~]6tD<0Z 6IyM9C@9 4.2JG^/!Gz" K, Z0k"&OH80u6&q;׫PܒQYO;ob NzA. ~wyojn5<6j@?פ_K _Jxpx)V[Zp3!-IԴvIܲ_:PÅW;%Ik'R(U5&9 BDNJx0$hBZW+Y.{f3< dŽa[v{&9]|Hٽ&;NЋ]T{!/<_M9sct딥!%ITpWpv(rS:"7y-q?X2x9=nVn/:M~ i#k)n,!Fu}Va6T3L8Q2x:pr#  _`h6[% !&\o$$1J29b][uANX[) g՜k0ueفOomu/-RWq SA*io!Ĥ9P Q9=^.1 vj)tQtIJ (NdJQQ (F(^{I6 w"I{uq鈱3<g/ $:*T Ҧ %ni2F h%\q8 BGv4L_2}h6CC(j[y %Ԇ \? QLsv{՞;1% yaOC, D?= =azw}q=^Oկ… ?ԮEbPO-:0{tvQ~_MrGoްtnٵ~9 @$ /We@z?z뒝uE>RN?&muPbrylu~L$xh&N8>bm!W!l&CW,5P <׸ z[So7 X+l qϽƶ7o{cUmkֳPj*e-ӏ .~zWr30 3Zf!J@ &7(1vkSf6+r#^hHRٔ#2 Nn̿ἦ'dP^MRLq=L') t`jr w E6?s7-˫M- V&)C N_0@S/yGgf[wIL̤l"Cw^wX6iהg"ڰ^t֙]3_P'" wk@1F.HmyrƦ cC8"A͝Z}>HHgeiX>92 Z:kx ?Ǥ.g6dG:t>8\,O\"߿&FSBv+lpT,)!s7ɍث8KcJ$ңhPY?8g*MsFLp[jPlPN%9UOd/`x{28C[f" \U[!pS 5 8n^Z*GKW8ވTc_pGs]ROvW^)NOӋ⥂q1ٜov9k᫙,ծ׸O|#^-2zީ~W=MkgiS}"0{H sb.Q>~Ɩۻ]0Fw?.(W$XcOB颩܍,.ȴ!ƥ|̣whx;]t!w4^g/kӻr\JwN=x$w,q}|sד^}zB{5l] Ŀv.?r o}pN9? wLpHto~܋wGtwm޵muͺFlеmu=eAs7˥2;-d z](|=TuggG>h5+혏@pOWldRSΧn½dN1,@Lr/ Ax(o|Y^G[H:3_z R! 
[binary data removed: gzip-compressed `kubelet.log.gz` from the `var/home/core/zuul-output/logs/` tar archive — contents are not human-readable in this form]
̺ug\!x&Eu'an5m&dHW[#ƥt*EdŁkPPw+E9UCH;I-wVgaZL9#dݬ{.\R9i4nr=)hв 4RYil8eE4kr1dXkrm:"jٰYY%ue878M빼L0@WQ6+r6j}jWhE3zo+cIe ݬX&,+FF+EQMIB ̺b&H-Z(lH +&Pk&*ڕ!iC[4hqEaH?lGj`CB9Z>򕛛'w?(B0ҿ]_ٙm:S!pcbDsgN ]ډld?ݝ ăB҃ȳ]=5C6 Λ^w!S \JXގWO|tx[qsOΞEe.j#Glʞ,ՇE|}jB-N\v!VK C$|Kz">!5B Ź2t΢ݍ5x=mX'Hix7oJW7ʧ|Jqf#A8%ac~0NRppۯBV,n,Xkx\ɜDze*#4F$eKT <)&X)`ZYZˆ&+FkErea&yZ[zB=uBҳEz@,$_Ej'ۓ < x3R@vK׫!eJ%UQ(Fƈ֫]`E'lgEv}vH|$y] q-/4x_Fq;͋+ i#`H-v=n|}_hq=w8.gN%7|݇Zm5 >hT^fӭY[BFP, 7(BKi1)RꍝZzl)Tr k]H ef:u4I5Uk}҄o9 H*s&5ޤLzp$ZF囋4"'k$x\@1@ɿ}{Ih!.nϨOkF}Z3ӚQ.fԳ 0bA*;m|9UP.^0*PH*:)e2)KO^LC}>ɬaX]s]n'UOJwj3kR5'Uis<9r8{0{Ė n;иv@73222tXIZD.GK,V] U tAjBcA8y)-#>YWVؚ&Q, (gb52B?cTysNfKC4o(Ɇ.&?'B,~gc'Ӭ5 ;*6@cz +[2xʘf,[RtsU`Z~L(isVJ=C`f0n@jU`и6udUrUIj#/,}tqUX +m01UoE3̴;c5*nN+92XZY` 6SQ Y l R01B]/*dy[a@v>Yjb:}^Mۻ+U$vv] -|3+rXddm8FCvũ&Rp^ =;{1$`eI5E?Mid7~n2 I6E;I0(2pMsF5LɒVk,V'ԛ+ɱ Ad]`q& d%$Zمny\ٺƿ/X=4G1zAVBW1$lWkmP&;yoedund5D9#9#1cB6\u["y )ke%`$?Wt8:@̌.zj$T_C&qR% P$-`2sLBnY!1s-8)@fA.$Ev-**Ue5Y:bvfT(Q+Ѳڵ:[9fhQ(Y/dY@lHWa&@%*>ZuqQS0E=+,h6hf S__.i ⭱ ?Ђ\Q!FzB1xIWϏ(,V'cDWl&-J$Prj:'YԦm(7yz̉{##lL5w]׭<˻An;.Ƥ-ر4uZ`Xt Vj0RUl,YEmI<2$ʏX6P1_RfY5@~nLBU [VVbR"O*Ap~dR"aJ Ph/ۄ-=T8aɆH˱avfqȧ_xdjUyz>дqˋ6 ']%/r% x@PQѓWNYp_߼zy7t㬹Y{f~5ч<>;>[&_O_Yo?|&Phph&a>lwd<{TzJ0.w+'o_AeE(Ո|QRFYXtu9YD~ƻ,~:N 38M4L~f[J5TQ{F~'gK}X]YV'!^>]#P ?՗cYcrS_Y*=X%N7ןϻpG|z/wwE2k 컥{ w$ߒuB}:^BD_q0ga9ӷx%X)sr?-sx4E?7wHvuƼxG]%IXֈSXb$[mZ}f2?%n\ob ZVy hZN1R@ЬB3F(ڙJ9A M٬nmu8KeX[ T4` ۦwލ?ڤ14%0Z}#Qj Giv;ClaB^_`=+"vYì֏> {8n$/G|ԃ4Obq{Av%Iʒ"8GHVk{zGV1[z^uS$DS;BFz;ntf[5OWrAgmR,TEk(qXDA !KY&aGJdL? 
J0P0(y&Q䑼Ɲ]|l= 07i} 8;@tݡWBhesF< x kUjWLr *OeWvwnn1$[ Ϛ#z?5aϤ۲+ 4JL5\$|aFHxG<& َ:=V $݆V6 Yfbbwr=333λڭJ!Exfa,4&- ,D`RㅑLh} jW\5tAZC7_Sf&}3VR4kA}pipEɒc&jkVHCF[Zh*]`SA5Q9kTVm`\)Vj#"cêQbt"hJXFgBAԫJVF9i#!ִCucRgRsB-YXWGaҒd>KNd`r9TE-*$*U^(wUl-(kp& T-ͪVQəe)b@+褉 KQZ)&3-JIy*VT B͔\]Cm9=ҒL棝O'@I{fV1YЇہkec{:2$Y`*a{nZ(H:X^aSQ+RIy@ G_ NV?N3?|"=Iz!'~A -s*ꗟVkV&9-\e*}׀X*:MB_~y!U˞ɴeW@->+MS_~Ni $fan/ͷv{l3HӅ0x[s~ 4VP̫q:yCI.fF;ytp8Iۜ g]\给+tY]==|-]{wjw6x?,n}dAYcS: *g!k {ڲdzkߵd]}ׁ"fHhA9f7P_MhY=c/_cIPwY7777 ܣ1-NC!_@l k&F$+O&MF#pkFƴ |gfBӗDŧ ES r l%Nax]s{w$&v*4Km& =%>ﳓ?]\± w.>+odm5I59k~$AduxϚֺ9,Žwa;_ _HN B|yoLD#ׁ}g{vg=<}%z–;Y~HlBxI-X# :[1yK*DirN${k2Tv?N}|yczcL9BmLIxb{\:-|Z| xJ.ڂ\$uUm<1V%SQBlLY+D9S:)g/z$*i5ʗgċ(e#ỄGvQeG< A+Q OHG@=Q]Qʬi']5d3F~sԬ.Y!X8A2_y臕3~9Y````c !B51 I e(&iZ!ˆ hcpGa 䙖lDԬ L6NGVI:crP;J"A&ޞ(E}`u05F Z<5T6i뒴PR]{jMP1ԙ ̒EV1a0}gv#"tzB{@mKf:Y-Evoy[j:Xp֜%~r͇ti[VޅOU#v6" 7W.YuW>yO'>KM }}E'Ż$ dX,ޞJ3o [s,ut̝,ǟhe"_ Bڼe/wܑ.ߵǻ/mݛ{t)=E;c0T"LJVV[mj!Z}w~nnˊD7SfJhgbjə&OkRI$Z&4OjmH  ЭP[cnչ=fD4e|:Q[?wrQ] >̘T{3992l'~C} kDikѓCv߸Yu >݀ v[)E?b?K)suvq(痢OO۵C$W2~?Alno&}8v(\ݞ?'tp #6PG0l 8u7(IZLUy9HK<~0X`bʇviEF:ˡD @пx' X^965d#y82u""y#^Xxi Su.~cW%^ΘK v4{ yH_JX+e6ik+[^;-P a ( 6)ZY^MXmQ BD-Ҕjq(JnZ\6d)D6(}%2=`6tK̨2v#3 F,jЙ gGޑ77\%U \q {YvÚ$YP!2<!P\e#*!THbI~K wDҚ}m7g%CHWjb'3XoԶ`f{2,aJ!aEta٬lv l%; SJy#x^pչ>wwؙQs4G.1UQ&3NDV$[xtط᰻a_,\ ^xCE:\O DHUG/Ua^4<[}gAsCt]3p0Q /Igu9%d^W'}~?A.P< +Ao`g fg Pj_Mr@Kubu Bmr@T 4ͯBƍR$wSm]91qB+/Tk㘼)-##?!PqJ106(̰{ײ]Eiy>ͧ_Zk&b>&oμ\q];r]#&tN `xn5۱@א:scKJi1H]ݪj*d,64ܼChD6xQX*N%윧P7aӇw Y,dVh-:^9w r/yem4pD79.kF oX5wk5R'ut?Թrv,ǿ׷WbM?X˿y:En~|mP1>oٳqw,aDQ,KM-bS~_'u ݒڳwҳ%䞗gO򝋨L5&T`S5Cj7O8=pݼ+][򝋨L)- 8/AEt|hE0 8j>$;Q⚳[o.zTd}rJQdb z3!KQ0f[FDؖVoI{ _p/ erb>j]bkdžշ#lw'EV8\Ec$'4+m"yn0,OnoO~@;巓|"o@8[de`|2x^|^vԵU\WIv]|}d+.F\4jM'K<,vAJ\,U&$OHZyFH\{6^#"s5,=9Qi:g5(wWej-z:l m2ۓtzv5>gI=ݝx{sZj󂪘9Ӡ)U`;RK89U\-iAr])p]mM ZiXI.euv(k]+;s;oFWTOWyoA x^^V_nOYWٵKd-h Y> 5U?!h RڈxsE9un(>RZ j;9&Mȇ㢜>;:csY Hϋц*5k"X|raڔM( S'&Y~U(xǯ|ŔFhY<w?K6l6<g $ $ ( Jt"6aaDX8)CIaL\y{^4#*$<?%B"rJzr+.9%xm dVd %폚CzED=&1F +Vn'E;zKO0י99;q M<r"|TRڻl$& 
]UB'sGl !5Ey)) I+cs{:LXChaTǧ(퍋HCU2F&̼ht;y~/?mJ 啶F$tɕ6Z"5߽sGV\~qy. Rd.gN)HlYcՉō-=O 8 7rJY& KJiHD@E$&2Me%-.;q\FrNo.% ^=$s޹Sӄc WJSӻs~Dhm@ %*||9"B3ډBmFv he~.}_'Y.pe:}fL%L. "1B43'̓2&D`wk/9+QrYM'>"cD% ^)go*X֐3DBLMU$(TZ *M'[XosP24aiD>ÿP`RQRl?g؃qY uᣱ _\O߮w!3YȤ* > h1Ou*E`ʁFH{d';G a4a*4G+e 4mhi!{"8vt̓4;%_.??o&kp l4qN8n,V+豁s/"$kL}A&&KA3f,W =yy<,2H]H0rn0ΡQҨP$ޠUZ *Rj͒-k<Sd-QB\1zՆ+ųDdL(9%e,o^{y{:"'x8sI EXS w)U,xZ!26m5v0`q-w}s~?ka-k|WxQ)7e>+tBm rOs s2JB"ǸAZc\ptCƘF0((n JJc(.`(Nعл-:㍾ ܸNb2 \t;q qheƃ"NIþj ! &hu_kC4 ɱ5䙽U}Bx#}E,LF:vǰxసE cཞXF:hPѸ6p3yo&Chti3y1?Je=7_ƓX (_u[IM^cGCI5.X5g㙏゙ ќ -e* e,r^deۑ@rsO["~z &^6H> 5;U\ P"s+wr=HQa$eDsJ$[xt5|*b !纤HB,X,M4#S,Ye%]ҽlaqjVzk=y'({Q7{Qkl%;]@ an;׀չ9"Fk{Uv ,Sat&d&gGWב<̹(x\aCx3T\*=>TB \~"ApOK';e 'A`&uq!jq. i%q0u^EJ(4:eQEf%(EVi"T,%梅ԃ_|ktB:#d-"YQtZ$Br*8Ve:/,tXϤh:7 fzxoꏁ ē'3y~ѢZ?g\WVPORF,ɖ5wx>n"찃=GMT@AٲV$aatDSZU!V5f:'W3v3r>c,ĨDмB&U*O%[xCBm-C;jB,if|`_ZM ]I2ಀeS=x43ڦq4{;9ȈlE=:oْݹ*R(&`܎y'e#RgmB.?>7t?Tp'&oO\F Oϕ)'8rC@kk6slaI]_d DGz_0s^x4޵:Hbm{I v57G!!LhM\ADa d0rњk,!ne=FL@p"ہl&$hZh4:…I4FF {ܗRfcz> sΑ ff<ɲgʛ6Wa{{ZbSl*vŘe@APr, At{fV0eM! 
mҿgT*rTH_AZXB ׸:GaI)v:O%xʼZš3,Q\,'g2$mr9yY%nGQ™;Ir8aG5%,NHe7jzHj &NINM?.ѹH@<(~G =66 Ƭs@)yWӽk֤MZO&&٤dM()Ft"_e!C ,_ )n"nvtK'VJbi Z P,i\$Y+]^XiyύV˜pTP*Jl ;xnRIWJ]DV|d,qY HW(K\7URp]Pz]KaTM ք#6:~)tMj_Kh_k&f(_%:yf*0Hb$O@ qZpe-K'ާJGP\d৴6tf`N?}R+񴵑#foILU%Lzs7E.\d΢]EB(eXEe$1"2J 3)WCY[%ȼl@[ e55)?ض y[ӯot!SއSg`rkJ_/q-C l}m˓(Z$5LQNݕGiʤvX%z+ֽk-їS\& Wnx᥽=7|gc<\J( 70h]5Q Q$4M5uT+o1[櫨:-CGɾfheĔԁ͔z:Pk8O; \5@yCe[C>ݩ49ixӨ8ND5v-qqלbg>M9,۔CdsdҸ3Ē'> 9JWB8sxZi !|E*1ͅKnP=z EY80v.&T1D"T*#w 8;|/G퉬9ƥ&m<Fӝ}.%;cq /3#g`#i8z6nK;pP陎d\vh"~'.dJ{+nr[UmnkHs&m*ZԺBBV)ߑ^n\h[UmnJM=1f*ZԺBB.-S,.N3o2iǵr2_!tRAŇ9qwGʡm%`)K;kXWL?0m~eJAjespNƭNF,<6{6U&l~=Ske(.`f3sr OiB=8"S$΂C@ ax7:(B6RsuM}oOF=T [ӱȸbc\鿦6b|uWn|i)]kEI85 EcM~0`xz}-f2{܀2 2ѩr3g)b`x~n=|(2ϋ< XնGSjSV9}"B sdkI&siTZ*o<02^=t(QNDhrHqϻ0]%Zjb<s6HmS.S)z`x;$f0P% W RunQU[a={rR!IWJ&*V;zRtcoN,E/ov&q1t7DM&y؜oqqm36KUi^kX:PU/]K oE&D \& %{a\Bzpז 縴HB XJi'SF_( bU뉌LBA+ZOA1XʿLDxQga$UVMbX5ɇU @daS*j]Er#O"kNll(- sN*4_ƻJ\ _@L3\DqpR Q)H$ փ-f^q@FJp65c< N-Q.2<@=0+`a I0q90V\a0 DŽ; b杢25eX *6ؘtwn4wcT<_B/YtD8%dqB% 4$ TnCX.0 U2SKA!ZD`X,yINlOxk̑e>ygW Z& x=UdluQ  ivvy]DA0vF([ l-2 W\ur3BR!N.iX<^5P\V0Zϖ/!o|4~zws]@/V`!Yb K `, 7 +p}* (#B_Ն*Um|y1Zu]?*hJ2;,Fǣ΢n\_ȶ&p'I 8CoHC- adH/VmZeƗ|Ž}Z$K #s⣴t&Y%5?܋LTO935KFwj m! 
KJ1R!"|PL`թ ħ!bz%O?O_ègVi%Q0Q`CbE[!Yzg*~`k7lo ch99#cng}p?L,)|e88nw^wv;[ۻ;;{}y=yCϻϿy= <}0~tft4no4pgוֹglvpc?yïٕT(ܣ[?Nk2- vmс[y7(f#Eݺ `xx67O=GVwlS!sr Xy<]8 h/U"kfPE jdzx*?Mlń~}Z~~u07HL;ޗ=L#'/2>;_|Io37_b$Əژoig6 S~v *wpXA5q=St0''_ ,>f mOV5Jྺ~Ȏsw?0KqjI_@(DEgk/+'.^Sc{} O-cmО{9`Ot;94 dNFF^,4g0LLV?;˄C޷rCpE6;eLXpz'Ye9޽G=չT[{eFy6_\b 4[ċKg|0F }'z>Ppfkwz GW!md2g, nc.8P"6nc˚ t$D2AQv]8%k b_62VK2&VCN\~m%FTG~.J^a0x!E%ȓT6#JJ7 x,l\ϧzq=׳q=빵[F PFS+-q2ѩ өJ"$ ERw=5湽߳2w;DS3\Of$1}=.ݸK{7ލw'0"I,{ =.M q>6X O`VmfƗQ:4}d>3y̶FTGh!eJr.7RRCWLr8\ ¥'{Kqcm@?6S|LjlfW9(o ϵM!a KS(o$&?Xvn5*TG8>%XJA8R;фsfGZ,]8H H@(H@R7ԫ,T v$^90y %,NH0ZsD3sX]'*S+(`TǸ"HzJPd4Jڦqe&d)&8 Y?%dyu Y7ȺA+DK߈gIHbF|ƃ`k2%st]/pT`kB3z [+(f8[i:lu#dֆ*d޵q$_0.P̈ț?hlbk˰d[IeJh{lTbH)C2RUq%3#L%{GmЛ5Eg.|xk^5W~S穑g0JVnN'|_?wYxPJ6tye a27(Y7o+*nu>q ĵ;ߓ"[zucoYߨ㻛m2ԸOk(UiR]$%ŀ 1؛LFuD5ք)U9Rd_KQ9G&~Kw|W囈QӉʡk-:_c䫃pDsmJǔa1|ǑǤe &["I%_!H} _m"93ۇ̱Qpߺ/}F۵}[PW{9?0Ɉ9MR!k`FEzfXNsJ s k;Q ^$kU:f昘yfa9Zsib7-NYӥݨ׻{ٰ6K2K9vR)u)Pz]U1*)Fu61*x-;z^+&KpKi;ވZBgWz@(WRS@m[$-e6 >q쉉xaxaED104jGDlG FqWQS"{B<,mL>|E6 18E#D1Ř靘fmTe_`mtEk @hYhz}Sˋx'yn+=OxVC:?zuV2~ {n!$k-bw9WG6M'瘀/jnֱO޴W(X-ESAܹ`HȾ,\}ž]R\26;[-wSVy||ś śT~}GWNߌS?ip j:u˪YWZi (ef8,uh>j}.Ez.i젏}o[av洞Zk poG G;1F^m x&{?!݇d%:E:xZU5f?EDkg~n5ݷl2#|DwN*XNaPJ, p0/EZ6jꜞoB $'^Fgn8<tSMAy|y㉯=|gw6.ѨPg7fiZ 3'duwvi㌕ =feArzy!-d=&)?F:5=Rs$)'Z$)jvoעOM0Ž :cn%)' Y}IʉcH : fD0l$t8&)V$>k@2$)=9<]6 趭x.-1I9edIFP tgӞjl`E[{v" $V-Q1Iw#oğvWICaTh;g/ϸ魳>йo*oPR'ZR8V%܏+} \}_BM_I7n'\\ f(+eKMY;kpa#tE[@ۿ?~sьq翾- {Ơ\ju84([wz%b>UON߾`v59W="} K9MElMɘX9ɰ*$} ߧ_?W3D`([6g1Jt۳+ sԀ + N^]")V{*R)FsN]۝[bD;K~%3aӣgIrpLeAbB!M7|3jg|'ao@{{qjg~Yߛw&3pΘUN0d2%Ϧx(j(T};gi!^1\H5J\SS9XI/jh@3]3zY} ZgSAWS=9p8krRt9HpT]5sTP< 2ƲY2{I{ƨ{OIu*rFQuBaT=QI@.S%zjQg*_kR|Nsފ{g?mmIr:Nl.EW:đތjx246 -O)a2S9q .5́M;f?q9KxLwbl6ֵ&MB *0R  779ZvG4E8 usVF)>B .fG>ž1(Y/i֟` FV']^'q% V1C3FeOŬ__2׳ouUL{Vw?C`w@OǶyd<]tG1Q'k ڢ!e\$qJ*c8x*Vm[a30XޖҬPG6{{c\a‰R@^p^Z? 
,J{4;J-6ҚdBMAg<؁|<9E&BC;{/<Ve0̘t0"ũnwl[qV{55z|0ֺI{1v&6N;$ a59Iy)ہkMQh$= kvf2 z7*XC%t Bk;i[ޤ=|SvYR:==y)ڡkr;=a qPfN{Pd $wy/x$Y2;|ISF!F޶x)ہkQe _u0%ӷc))LV(q&,y9u nujk8`5iN\z6~ U!󩘼wmmz9؃ ?,lbqvAr}I զWr]ߪ!Eȡ4嘈[԰U_ՌI 8{96y")I.QeM։p 4,g8, [7$iEjFD1 ۈ`ZLkY\X:~ m])œx% b-D7" o~􌂻 M6Mm4I/4pċhz o([TD@l@QHy0@j |8aؐꑪj0a`Ux/{ (<,6uoy˙Kf9-cVk&FeOM1({?bM#w7y?/M#2t3*͘FPeOm#b Qs4h讉םX@C/m-0H6{t| Ф`3GdFT-Ys%J jZocN[]ѐbǺGL҆2E%'Qa4! ?G[/ׯr ⺣ LRCPb т.,ʈTӆ4"{!juGhC& %m2xWr zxnAS"C)cwPVY͆R6$oʆZi=`շ Rw q{ E~]/+M`jrH4_Lna=ЄʓwEtyAƇэN[Xq'un{b>nFӻٟb2=&KdZb'jO`Rl3e&OW]l*^J绒ei9&:g@M_˲DS#&C >HSXetH^-ClYɟ`ZYȫ~wW dΆM'Y׵j-|ws/@1 re0ʪ1>*S",hfR00ӳoVm4\p:_N0< ,.&D!x"`,\:GY>O/WQZ|oFM 2mEfj##q/tQjyM JVȍb~|R |!VQ.cIw%L60vL$tKWagn -C7kR P'V xPaV{cZzC%MiLJ^q2+o#A/&k1&8楣k#$.HܘhW6طrAcefjAsǝ. xyfZR\¤ǪВB+qSo'#F"zsں =-!˰{%ჳ2(:0JO/ ux`2 lBf^(Ս ݞHʰōe"YHX[*f=W<4b6 f)%*:X89:^Ӭ\4pe"d,W~dmLTKyv8Hӄ *u,Sh*{%U %<Ē׎-I4щ:e~A_ޤ{öL(_vAlH,\I!!FF%om}1(Q* c%[! \,3k})Y=&'LhTmdڸQth,l{;)K{DPʴPHBREL7H Z,$4%EsZi$UV-dRy=K Z!Y[3& "d )4B{J;P G !"wH K C3`Z#Z*F ö>j uӎ9+JQ -=oƴZ3ITF} la`89R=Y mLnzƋ2 "۩RUꁌ,L\m4>T !_[ջmݘF.Z}^HFﴆ@@pD1T)"ZnΚ-laem4)Z/=ѭ5ucrlq&ɸlwxrѠfX4#% p,ں`ia"ƷO4#1[($Tn!H֖kV!׼C;(d(}22v0gJaD" Hl0ұailB%j1vK"Jr Fb1&\k֩$e#UCTc)/ %a`x~S=14PN(Z/ *!F\#œwxY U'/X ; n_醂c`-X-ڨ 4D=LBLw8G H&dfSs2ݸjc: \1VvJb.FbXo]Rsչe\GƐP H| ,qt&I=!D=p%qL$heD Lحf\uJiZ,lbh5'PKeO4Lto/n9 ϯ&!,r_~%n_V7a68M`L61xI`ك!M 5?ҹ X fi]GxSm`igQsx6Nb.s:]^WWU%A+&h/$_ӐZI@_k`+^M]Œe+ipLGijxax 8h vw]8zQN4D!wjðg0q)eXt ̎" AnZb kƴ̊B &=6V bvY2&ۿ}n>M6Sɬ_O"Oi`^SL&,ژ(0fr΂} %%#vzVjo]Ī 9 @`dB ޞvc &;v?YRa2=L=a\f-Iq$βqzҌ5cq#K@RR6!lWG)2Iu]+.!' 
Tlh'%ʹxe&a{/qfdeWP"!B@PƝ/ LjX+#Yn=;ւ|Y%,Co;ELI`]|.dvdYq|r?:VWZZB0^0uR|Vsl!w%T$fyj4-t[`6KRQ8/G\r7nz2 -lnSXzxb?G|wwWw&7wP|h9Ÿ''wuOvZ bWbx˵6 7g?wa7BClu?=*O)^s_XŒІ )7KA+;A3W:3gp%ChC1vIX]­Y ՗o$˖,2@7И#L|23 \11)GC5R]^:Ցu+4Β2k!Ջ״ X>;{֌*|fxwtiu߿:t9v:a)sEvV}pBHdP+fiXELg&&jY/ސݚ8f>_?1ʳhu0t=xB4i6{u6 r>Wkz5v~){4K;loҺGAw<;4ˏ/^cܯ.#.e,Yzoܨ|LS}/R{ۨ?ry^ғ|*I*!^>exM:uʃ&m}/ec}d3nѐJ1Ogp~򨨻c-vʳ\7U:]?6_qw7eшK"Xb,"Zi;P'F$r_mI$sA&?}JKQ#D5byArF"`RcRX 2^cE&s{ؖU_nybagUnfNF3Js7N^ s?:MWsq ytWV&0̬f7|i sB^)&;Ų;%۟d\>unxe^P*G ,dl۪lq?xUi`:C4 Ne[6̎w|Ⴒ1![˺R!7=uMgp6 fG&PZ#2;. 8d;HE֕iB0}y+ h.LN؜֤b=j06mGEMV=ws79i$EA#<|Q崍aV}ھT)BٯߵGzFRaOϮpZOrzlk 0ai( K%ΊScB埛Oiũ ';@ n0H^3(c-=D@EL4"I£q$! G c`1YQ&C]>6Ę!F9{?$Q=|X`$;a0 iЗpD+{+{Fŷ`W J7ޑ !D\'LJx/{[ޖm?WCTB)KynY+r4K .eJ2e\&yXׁ:ŋuh~>0?[I2?>U39;/ݡszՙɈH+i6*N / hN6LzH`W<ʲVjc@"!@}5w,?wG%92B$ z6W C G =<$36[HX{LYTyIp[gUHX|/QWԇPL8@XVOlk ۜb{ ?$9Wotf '֝W| ۬n^˿N⃯k*THU$)^{0?e^v߮ @n9T;,lKTV*e1 O!jSG0OYzyXmkppr3C$Pb:XF"fHpV ĺ˼ m' ɕ @q4W b&Giyo a}O7ƅ3aC"4vȥ@KŴLq B=I|0D!B|:9H|1?˪aa1BWE@84ztV'ҳ,јT%>I.)IxRIQm.BR]* )p |H$ ; 3|NJ10SV84P䬧d@Fe$BjwxM3Os__ƨ<[0$C^d,fs@ycqp±75t$IY @TFI$SG2 Hc%eTwggrZϦֲ5 fVD.lJÒKSx|i$1%~\ܷnjF@=\?8qzAb|gߣ2OGeA A*hlzP" jWeʤ?s]EctWHL$,+-ٗUFf9a>g$,)-X5| A/MGh: H3m|gvߨp`(fx(f XHű",2)Jc@TH K W q$ٜ8)}G8T)s|\rcra (%>kKq. GQnîoj2c'[!Ɣ9 &$vS8uĆ0Y$I2 'J*KJ4֋0S D6 )WAYzdz<.AN_#|17qcCpAw,KRPGŸRLz&Xy= ^pWv@1Ŏ=xj :cV~ڬW` ɝw-Tv iS8z8*x01y(\VV. V)߃nk^{{9#]>.t+`2c~ ŀ U~G%H5 jlW'L SmP/3[kքAicI>tN~mbY_5X$j6T*J@- R 6>|9X r"E7B KS[Iuf=MzlVN&ܰXdΉ3K ^(~֕C .9b}%3A2Z|`֒'&GRy'J%X4Ei0ƳZcLo& FJџ7 HV  Gq G!=X {~VVź"KJ.=Co كp6ykQ2C}0N\b_t{Eڝ3،?`ƺn[BqH"GȝyȄ3!5EDѣfE@>esT`nZAɂw W|B_Cm^Gy̴2zϻFQ\} I1jϵRBgǸ7V[ 5̖U0]ÑEuAwºRD{ @r.mcSF8zz!%_sfS<|8#|;IFI>Q2ͥC)5i]Pli!9sqʝVC6­YN'p8n ʐ<|ȸӶo10]4%Li 9GLs^4^DR7 iv86DB]Υi* ' j!(/1?c9e`ڴ p04gT/CITi[DeE[Rcн.jv cU4ujSG/R3g۪m[m. 
K @$wշFRpʮ:L#3H"DqfS`5Q^_bx7e"eΥI9+?C< 41Xk>zF,G9H@-q1SlIj/{\πi P*XzSՙ%775 "j$>ZodAAQw:'sBTK @$-m"os=warrsz}5Qw_WibLM/R/{"էdg-6M}w_}?m2Sb}Lrp,6B7A:Nf_1NTJ ӲvX垁#Ìgc3FT yY MoE6ӹz;m58Ѽ}\lSt%{=痟T67[]7?,ew'Q:IW|uֳxNC]~iٽ{{wrog_=I|/?_f曆V-DSN .'|;Q2?|_78O7.Z|S>}~TnZ~}q[կ} g ʷbjUӛ[+~,st}日/3|_o~Zb: ɏ*Պ`f:Ƶa œ.^z ף0Qo|.?ڭE4߶{j~tzm; ӟXiu1}P/c`z=dv9UG.Uzr$3ޅJm6_b;hHXxlaaYߟ}Xq{;ۚDV7lpכZFzLyS:jUvЗi`>Sge Pw,g\jP(QM7h!m}]koF+/'٢:HśBk-+4 II eR&993g̙ҍOi]i9V˗C[^,'TqѶg!$8@ݟ':)zy@?8:==<+h[!bt~1̓]W*=t_Jh8<tQ7KٻO`0oVs@ęzO!0{\B)e-3SKq5DbX&*sw~1i7tyCJQNӓ`.ߥEZP3ΈϘq"ScajɣrgU p kZ9QTLk)J[2:/*iw!\зw{^f޻%?'%kO]6<ý:x樲f ӟy!OEݻwwJAS?PÛc."wnqMPv:?q)*MaaZ |Zѵ^  cA(Y0c+{nhN+ܠK^ې5zyVb~3I0Zq6xw BwUȺt\#+B2FE3y̐7rl)U\uJYv)ߙMQNpWji2}Mmnk*x8rqQ;;ٖƾs3SGx{v 5 FwiA=nbjG7A7ZRӬ\z0aQx;uɢ(5hᗕ$MJi^X)iD9ԡEL D}\?kޝ|kRE˗G}ѫǿӋp6pkOO~zËnz/?=y[GYV/WEcR okIgU+%fbrIuTK*T+WNאY~6QqsVD>cJ|sԥкɿE4BBf~%p3}{{Zyi< :3_wx 2v8^Ia7ٴT~ .u~)xb 1N֒wͿJ9gyOBb=5põgKcp?߲g}O( J x=%ysp#g{rJ7kIr!ҲJ N;cr7z_R:Oq,}'}PœU2K.L1@[V~_Z E}c6 d?ʇɓK:$rAvDn%n/p𺘐K˴;E?H!+yt6RPhQY g0cXpp@ - m ^9&Gv^P;^ r!)?H(Bb< `hPD% x\O]ݳ!=YN)\Q{G)c;JS #LIJ ]Ƭ%R.±P ()Œ*產&I)ȎRB)vQ)n}/% Փ7$P\T:xV#ypŁ|$"=!5xIT^')Cb᧏;较;>tWp+Н!!4Øin2#( a@w(WU9u4@Pvн. ;辱zߟz'ܝx1$;S`H.Ӛ0G sn{)7݂?XŎUXE"r$eztvjBδ$c2͸'"T*_}VIŒ&+F5y{ 0ۖl?_~(P S`s"GjtANƬtTq/-}ژ \7` s`JlG_<8wUxVYY0?4zcB-K}։fNZ筊X%o&V*m^AF|EZoM*\}ײWqC\[/[؏b7 ?.? 
N.&aTf\RSL(%Z/rr;7hNϽ&JV8/g8?PW}t<^9|QBׇB5P5i!)}%RK[ӂBOW s/W:QrR:f Oen*LtjyqɅbUՄQ@S uyw˫q]ǛIݎǘb{=F"mؾF D֠ϽP|`AgPPb$-ur\WamqF@ NFp*Qߨp8ٺXRoS7iįKВVaFa<@٩_u>u'ү|urAiS2#4\ Atx74MwQ<-Yfx ~zH(EDj6NSHDHJ%= ޯ5+e^ӂ:o8C Fz/օӂ&y @yY7.]fF[t)6*|AQߍ3Ԡ㕯l$DZk-TЍ.Zvupm*q'#Ra` Z XQi'f#Fr.9rM j}~(ElمSI7;(bQƐTZߣQ B M]F U׺ȅ@LD̼Fc)L%\4#TF"#H)!]{5-(kZ+g,YFK - is`-;ͪ&9W%NU^<:(RyS_~8;z+EaL Hv  ZsdOG|*DZ v(H( a_DWc/얦=ݯtE}SGeHz1/q 1s*C!/bsr$8CJ'g>,:xMpˤˣ(FH""h f,B ̧O RX%@ 16`CT;]~0MEqc&U.1p.4j9~W:ߎ'mݺrʑ?9XҢ-Jcí&(\"Dbh;N*)T"00c3 Y"[dcDV6\!r7nEwGfu:-[{Wλ\OĒomoޠ~(q-Q_Q}9כ^/VWy){,_,ޮ ;8M>RɧV7rNok:o|?j[Lu-ڊ-L,ŭ-'B̌u}t8$ƌm '+xΤKq4.o*pN[mW}hTʢPG-g@m-gŦҪlp{~-kIBS- n`)YS8r_rQpMocx` 8m1`Al.0ʂTKd,'ZXu$).b?MY`%"sBpR,E4-R1z-lP!E"zLcRQ00ECZJ+ 4/h ?g̈́\4! !|ygdAXt.Npo4=z~S|CpZ5 oN6œ?\qVJ)ZA4a` @BG#V2e( O~0{nǿs9y;;xf^֋BݡG^GI%6U-Wjgљֵnn;7g?d5 s[^xo ,Z\= C~#Dvz[wv_wM ^/0i:~AW\vJC@fsk9!)y-k>9^7}`:cݣ07O1|&($4|E?6|JnՃև |.#/gC*]+ WRJ`kSeIğg!Z @=̉y>v `[P]~Tb;'޶6b"]IؓˏQw7ߵ$Q9UmrE?3=Qx/&RE&lrU-Cmb6LɷZ<1Jv>Xx?>g05M94ux*hu~'z|Ɉ}>W5x)hy`JѽZtГֶ8IOS^uz:9>2Fsrɘ_A*GGyr4f-I*8*oHHXKZ>>fgPR{rZq3r''+y#;@XTۊ)kC/b>_]~+o|#fϞܘ%hÀű0+k}:T,uS˅u2ra,8u-:;ζv08αx女@zi`9[PBi8@LRܛ1a_?QbN{9t~>QkKڍg_XӺ3pdo5OU*n_]߾)gY6?4Iʧ4I?Cn<\5d6 =+()o EtR)]H]|kYBO%2>iX|6A~^YnL7'giP_g7S_)݈-6ut|6Wc_1JM4R>xS`6$3)K$YH6.%qtR< w}PY !D[FZjēx'(FM'cIԌRS,U}81K`^89%L@B"Q0zq}ZƀыLYbG >{^LB2[7霴fIO)S՗\[j_P4`Żh'u4HQC2hF') cT%dOUzGfn޷uvs RR;ЈZ hgwXr= ǖ(مr 6P̴HFFhBUv 2cZ榴YNN#.ӟo/&aU;E (򽯉k?&C&*^! 
sb#F2cQ!P>Yibq;A"߱݌U'9@ ۤB)arc \\TFh.E5Hu6e&WWg|xC^Qn^Û_ۋsoo=fg˗?>%#LZPgi%V{aU2#ۿxZ;Z.<֛WlUl bNhD@@ }AkjO9]'Mlp4Ud2ActۈrQv1!TluŜyk6"3"D'_P>bՉѡFSx.ҺyS)l #!xۘwcFa/c6{{ۮRig=)񝠭)@)Tτx4ߖXHyyěZ0=QJ KA ǎA=,4@Me ^.,33YPs\؛pZvl"jPךE&暍V[I9LP2h 6= 95Dfcy2FgԒu)BBI*_S,p a/r sJѷ2u {]\&Ի.uU:s7(p֖n ĝC1KF' @Pfnz҂sP:GڌnetS+2&BsD'&,(m3!`rlcP^<4!&J%:؆=H`sP6 MN-]OV* IW.΁7oZX[.FT% M`}P҅-&xWIFX@5QOd^~^|,?FGYw ]zF2<0Ny ?o]7&Z]{w^r'D7MbO%5;ay~u3FNR=3ńI5$*T:$tl%y!6T_FuA`wANl4m)~ .C I;b~j*zqPedr܆~tu{,C=CEFWS:W0Z8Nc佉^2VN뫦NSⅻ}W3QMUDzaўuhL'NQh7<%)E%:PT\(Bk\ރXOKKlt:ڣjHc F jb$idtLtu MpԵDa7Je"ʶ d2` U'@D荑:7`R෍LB޻A׾n6M!f&_-c'I>Oa{9A$<D̷X[ e tbh8CP#JZBK^قLB36iRHY!ƐTٌB jjF[;ua:Bj^dM>h@;B fvBqMqAcn۹>] 1iY\x@䰚|oV  haZLT&L !U]˯_ڠECtNݷl٪f ^T/ջЄ}h#^/Wix]I(+m ܆f+2,y\gd8߬`{Y"]m}YJaOI4\;ڨsO3f'_8u ]tAP3'O^kV8u|"1W}e^5:y0c+λKmX(ʚY=;P^~E޵6r#"e7ilfvf}BBlٱl~-n$%d0KYH(?E;wT6ZjB"| bH >&U3+ `<0RMji{eaٕ.Ьݯ]!yt+>1T|"r#Ǽy^[ʥ3AE*  UD<>|N[!`!{#L{εڷF W-#t~|Fϳe^8zӑ6ƅ43mfm/J/Lnpe$m}(:vLt d́"c6x?EJi50+(%1LLJaKGM]gn@qөf TbP4[r2V Gv#v"]5CxH(]E-4D%=$&B>k`.^ 3*汫1%roH"zJqnr^TʧJQi3h+3X[s ?o%SIg׽qIZ6[9&WN^<<|JT~xy?=7f@Z~,Z+pm^,BM`0?OYg|E'$OE+PQjI` %GԌo+S\Z"}0ȋs}:я~]takwO"/߿{x]isc}?0%RCgF>NJpB/xڧߖ>8;_Q.v>zHaVR<8Ƈh1UJyrmݺRH[S RN6X#2Bn7'Z6F{ #A5/sc]KiTbR -)B.8^Cv^aQZH(g{U :5B=\D 9"bY""5 U7?9cyQ TGu5I,J!D<,è_y[HjN{PyBIП/KŎ*ڻsEKʛBWC>ak4qK$HYư{x2s~ lcʵ4r ݣp7!~n3?N(EuUpdj&]fr?o ww#ϙ8T{'kE :] ;QG.m{?:ₓ+w>ItuEuiiNJvtt4`N ooOLJؗŃ?}`f7^.gI7/myMR:]I,5K!eGw\NJh)8amEckdA);fV[b3RH$[+X=JZY_,}5z:bIi6*niƴns3o"##ݷ_KcZv@LW o˺c6ZL\:Eu92\.X/ּN9;y<Ek~R8)ip$3 3%˂׎h,[sI*Y6r֌#J{ΛƹCCY !lf] ʡ`D 0ssD(\`?-߲埿\y]Rp$͔Wdˍ?amV v#枪aig-=/ *'!K%jXiz*Cϓ@Sŝ/&^\dY6UEyٴ<w^.r6-"cOLly滻G6.ϛ[!>D(~YObO"#8&_6Q1T۱u)+.O~H5wgqƳ|Lu;XĆ5W530L1_t< |p 皿./Fm;Z/%e/6@(G`~'fy_)aI1 y ܚ ٗ;48"Trp7\+X_P$E_mHɰgiGX.m]< j'ˣ),:cNLeD`WzF\OBMy+]sLOPO3N3ԲV-~C3X8+X4ł:sV2x2ބyjp{JՂzC?1!1r` 6p$fFr bIuŶ×[ר ~V G6 Q}2sBG x|EWY*䅱|7_`Ţ9|63wġs+3Fч[f6slo2~iFrE5*[`9rԀ(,rƆG0EX˩ 0Sz3qSjsǯT4mj+=Mǘ3Ӭhs0 F?g7oރh9`؜|w 5C'WOR-J+,lmPTs& @oƜHb&u^*ǖ1B_ 1іqAr(kGfYA \ /bq8-E\5dHkbUa5}D[gKGJƼG,%SgGjuKHJ ]# 
APj3a=)9haR$M&|pl@ЂK6ei/% 0CQhP-b3&."!Y qr M< 9I}P+džb)q<rͱΑfUN# ;pZ F{N^55>=š",Is*]~cC.ׂ9MD͜\|/ <3%{_matIGt3S5I6yDbJ7vZnjOJhbF*CBP*C āM3.2%p9^-F2X*C/yZʈK sl2o3A~U3v~]K>JggDl{ RaBHdǬv0U9PhCP /i6/CVm)ƙ*pApFx"iZ ?ӟLt"2;;?(w^A6*X)E:&_nn(u'Y7zC @dQ}rt'kZؠZbE&C&Ϭ*3#e8(nc()*1F#w_>[$li|U$C&ehG)M[N2E/ x&su0' :dGze7ݒfX JX{t9H-Guos`U,G_rIy=',,'M3bl$yyuڱkږ^J;xHw;3ZgH4RǙ@t~EZ'=!KE2Liǔ0V`a5=lư]-d1]=(Bтm\\ү3"ȃ&z|_:6 L=Y*\F F<^*(gEbqg$8YÊpFP,I +`j A8Jыy>]ƕ4 e| ) ;As 3W+<&Yaa[n1%+E)A͉}G HU17R r@QaS+2hjBב0$D@J@[!)J`cLse  Zy RYSݮ$vDi`0>֗>))c]q=7:#dNcOoVFOdFO2i _e2~7-Q q˩wPCp0FiU./Ga v=JIUgRbP%ol;5yNR^1o?%˽Kw[ Pk 5RhjͮԄUj?JVw!\\*ÆP-sj\YC"D_^,: &/'CjbM.Лd ~xz`fwlJ˾s,~Wj<>Jvm_u~30w S]].|ڶ=!PYRhhɇhҘh6yϺ1\a8uK f褾u;:˪ t}[Z.Fȟ|SR)C*k4_g&/웲>|I {6wr)z|yiuRwf僀<A7CbtS +r"wBN-^Ͽ/`e=*oWn~;g*L4Kbv6ٮĸ8mԠri=?{߲*!#H1J',%w K )00@֒#Qn+HeSƏsNy˭^iv8'],HWf3%̃tAZGy06&i.jN_S se9Ŕ"\2!Dv;9;>O.'{HVR;{r|ڃ~N?)<))/L)o_'Px ChX4_02_zB0X~% ykm0ݚ^e{Bgy^aBr KF^\YU qxZL6eaQ+QnTܨJV%^py N|^Fzpe֋U^F ^WH9KXC. 9NFW~Bdr5^&ix#슣j֕ ln"Wz ss$cHGҜ% h:%݌r>+ d>m̈.BJ4P,7QG(TȇSa+ץ *i*j|VaSXqKF5 9p8aZtpzpe֋1|0`2Fۻt]| 8G{]|ͬcxJI('  `vBHZG\zֱ^!ztu4hWpA ûgӊHp%DeϴDDaD65&Y7M.!MrR:WxEA nlGyۨ7?HΩ E6w #QL4^>R3L61{9|& hrb:0{cH93)b̨vY!7V04*A]L1A\!uNJ G wIbHt{אl?~f^^eo7;׷֓Jη+Շyᗘ_BJ ޠ7\[,*󯼽{ySDB]Lb*-3;O"CmהB^7wF-Wߗw=H һgo(K|BЯkWs(} $o'"qߺ\nV~Q3SW3wEVRQ*Xuԙ@XޏeW[̲İ~, ]ߺջ?8oY04]~%_;qSelGӴRIjvF]Fj ܷplvh_]޴U9),\;kTA;UECǕjʑuFEw=HtVN_Z”hi#d$j4U^S`CtYevדq8 an6fHT7?{I1nw9g@ [[!: 8ڍ`*C0?,^>ֹ#^~p^՚nC2Hb3>EC/ql 9VUS?4"ًwuN?w2!WsuF4UaƂ!ۯFro*Ž͟W bl8쥃/3Ux~ʡ?(j.i&G ^'Ǎ7B :}AK%:W;vslh+OiEܫB9 FG͡2}H*[.^xW$c檉2_[ΞIi ŁWbf;kfpxoI:H2> 0J?xBw*S$cg0ieNބB 0)FxQ,>}miյ:]:xpzpg9Q!5̱ Ab"Zȝ3ȞW1ȱtC>X"0I}ыɇ"G~B P$1UoV;'O/է&Q }E.$9;Ld^D7a? 
63򐞇Ӏ< Ci12q[um׊R  `F\8X~lGWpZgou(]0H8; 9; 93.7!r3BP%k$+ ߗ塀ˋ D5TF1`}^nR NP2dyH "&7B $KY~d){Yy䅻_xa~_k<yI6~Ob]@佾$5.'߄[4o/N݅F~f D[kJ0"ݱ>-Ձ7_r0dP}OEKSr.o&EN`~'%CSы dIНqt@N`/4 prAl+I%~aB`B _ H?w{p ĻeMzG9R=E3EqEdȻH+%o4`Z#$F!D6"HiXQF% ݰ 83]Ċ r0@[#pP 1kr~Bd@!u4#y.'`2|Ǎky(~w (x4ZA)LU& Rgri\f=.V"^W ϳ =(]>_xj~N:> [vh۩OG3Ayy$j}Xe< e1L'NTZϨV&+>6#kf㠬GzU齢4V(Wቂ; 9a#.,`^M0REeN=R4>y/BW,c(@Tm۴\kW?w-][Wi6vTnǤGw6-WCIwtnփ^is>2'.pcx=q@c˃i:jdxr4^Gru5C"P 8EĨWLd:/␱>'cq(Lt9W 9 =RJ鯍>_p~ﯫks+h(tp` %IN$7( HnBcn6k/o=y ' #EPP:dK<*TDIL2:9uRX%ϳL+ɰLkM3kHiʕК)TA@0u$! EQH@S?硢jNjԯ`@XRLN^9̒{9ZST#?G 앳쨡#/A2 #w٦A,uY8.YgO3dG>A(Ȅܣ ($fdcC Ejڳ^cƩ2_}*OpNoCh aeSfINwC~yqY-n|UpKbpU;3-AV/G*ŋ`Pqj^quJ"A =AqB2F|#= :jB!t<>^\btu:Q1Qd 6-8K#KijGo 2 @h8k(״#vPժb;(fR\b HA'4TD.C3p# .!"TU#:`3cgPkR2\YikMnѮ#K/2PqbX{u|)K(aخ!-ΤE!O: uS9 `]F s;)a~#0gB]ksuչM5 &9V@yC*Q>Fwr:'xN+. ]uK/ϻ!Z9k:}cyX^XVEiN9L_zJN! CҡA#ȯYLJ`1șx f/JM}#7$`x ьd3 ˄R@bcl$pThK`6.VtukVd,2EtlJFӲl6Z1ј$⮸6K*C_tOfO6QO`+kxn矝G&|Y޼lMWBkE R<>z[waHyTw?~peX.g~C5꒬Ku5Bݻp1J+(Y)=NQ'ʾ$]!׺_ui(!@%{0_|SRzo xw @ui\FQ]Z>x~J9njŔH`٣ S>K2e,x&a9F G~eDG^7FS EF,@d9HX\-Wb^pG$h4ҺDB:|!Đef<!!_5H[RǁwJkqMC_hBJQׄag{Us.Ӡ*WW;W֭7TbtteF9kɳs{v*y/ <=ɰ`W [Gޞk󫅃Ɲ"A13&rXwAB bRl.rĤ1EDě qŘlYS d^}{1)~yNY}o~g,_^D d<[\]m^oh$.elFРT2@{o+k8e/z6XWfe1"bw }tOwA 9ʫkY9Yq$ khն/yqc_gI9.pW$I-eC,hZ]ioO1$'٩g|nPPRrT5JAոX`nZ7R>+@ Skط?'B~墎ﮄrE&fc-"Xfv{nyW5qW(.oo>޼XϪ<5nrZZ#/察]| HYH ȃM;(Ib$h)Apw}he▏kvq f-w%~Lv#Eb^p![^v2C\ FQ^0g'ߋqRnz3 Ĩͨ9 [ٵ왏\=CPӻ0H M'Bu>c׺ ݧw x{BVِߛS6*,/byؖ-7/op0w:<+n;R2p)d5*"Ԭw8Ob/dRh>2̀+N"G6F0uBdL!xJ(.HZh#0Mz<ӻ?]\t?{b$_X_wCSA`i@R?١t 9qBȒuD,bͩ!K+ Za1'ɯ"^84 v;B>3($hr&h:{YTP^qes9ԸQd,]aC/Ѫ|N{Em 8a{҉=MSЃR+P $ (]/),‹~BM nMrU.T) 1T=(%Pmx8^5sèG qjur=:։bw8fg.o[A ض,Vcؙ37Dm{NX'>X| `L.&>t;c:pƺer^}o&Oi+X+>qb+NVC |;hl6&u JI#>YߗRaXe#cגD躆rۓίB˲Y'^|^@ո9~ -5 P!EV^EMIR9a I FfN9L9_ց9Ț9th>(}JQ>h# 2#`.jz,,|1>a`"MQF%Fq~j-ʑ r;pd.Nk \eRuU, E_Z0eR_xItf=&ʯ:5mŻ5s| ّ)Ŀ'J36I<0pY' t} ߽pcS{Nc}pҍ_AfrH:(i*#:cɺH*{IXg7?^V@ E]lb[[AzOg+{^5b#nqE#z}{B;L-r&Y]l6id RYnjB "[ƬJIz"9~9+R@"6@` c&,:ڴ>rcmUK <،<޻|Z_\3E4ۻEߜc˟~B|xzc=4׳OhO߾RBaų7C?0b?Wr{~~WfBi~_qr~rۋ 
|Msͥ٠b}?Ha "! 38>7(JroIbtD>O-B6+gV->Mw'ELďJ>۠%P82|r9uٻm%W\~:[/T&uj:SR zؒ#i"Psa`iEW3]/vo6G4{O,nƒ$F{tMUJ#W%Q%ĄO4~LpU a^QR&=V ~XXiLna\R%%>ِ֢ hMB1":#q)y++PxZ9;-l,3X^j8"aO1ۘ*MnvڃwZl@o_P~I'G}тH)%A~.~- ;FDAN3ci6n%Wv_GnsEW< \ǙٗiMʴK7_㝳kvx|3%bt-BW?|L˿^_'sPÕRgy+'BGMGaU,Bz=)T(B*,uG8k8e _H Ǹ#Dv#&~l{-pzrZ[;ֆ`k⃵Z۳Sk ^BŰ8No'.L' zcVLVHYwnn1R0x8Fތ4(jc-n1gT3DL3h-ۖ0A#^Y!"98z0OpMDVY  1iOBʪwiK)n(V)*Lnz+jk*'\\ύuɁj;7*\ʔ*M k<0+D 9>lz bͻw)RU9gA)g+#+a.9BqAr(qǨ7LRIxLtcBj]0l^LˀK)_t׍ߍ= /r9ȜLŽM[ @U'wu|cԐg<_hbE~|xAL-59_?x9zs#s$e{_ Z~@ R=ٹdDuVbWԔJcmsr3f؂OfPBk*z%L6F ƑѼFbG38պ87p%y½i nl}?U ?y:*p Y q25!X4ɫѡU TiŴ|߼B١y EA'O:a )h^hdATԖ$fX#1XVAG[PXŎpsT/w;wOR6R+rG*M#w>׮ 'D5@iwGXRRWK0DsdGK)]V2-fؕՍRyP=&ad}JjIZyRݎwgHa1Fbd.?n?|&?!Hk ?mqdmqvSqpn'NR - -)dh8gd C)A볂#H48O _T?GbWa`Th찭+l,8^t(d1 $ZiS-$15z|ަnf׫HiG)?R[/!qY&ڀ^Vn>9of&j›[X}Q slh ~f;gokNol1-a kVݷiPlKj]KjΡ sh}/HuI-Qɲ_;\t<;8]BD,ŸWv[^Hytp;dBf<;℣ig+ ׇ#y|.Q((#mlb%\Rښ5ȔLZ8CgƈZ"\9 ;(}cx! > $`2d?.|9pc 9ha_ی1 > ["C.!O;oHl-!1ay^<~0E+=)Ñ8e )k.~ɝ?og5w5Ss-0z47@żi~X*vߚĀY?&N=&srnuSqu?~t.魛.G2]x40]~]VY_JpË#utPԪ6Gq̰# *JTL R(,LJ >̜Qi( ^}#&2rrwOTR+ ۮU%Y3 rvZi*bY9:bq q ;RKI_Pwd?:Z֎ߴŐ#=ƶO]*dQ *hp `BǠlY" $e,C i(0k%YK$?[&ZH GulSB3pnDm4jdރ q4-Dnx3.{#{1cuV2[g E=}UjÇoCK4?c.9d3'7_~t+GGKUIQiceTUUhvloq F쁜U/zY4գ@铤$T'Gd GN]{QNT8"箳(_5l]'%TO;?`Iߓ7d~KISLȟwV(yAFj?;y`4|CQyυ *b[x}#Tb;KxkB}^D\+.'D3̸֟{' <\e06"v󺶕!XغĒPJpuv*=rT]@%FT_tX`:2$;W ⤻)+!mU[UOW#{CXIH&HkLY5q4lX‡?4F>ubGW5X5qgؒRh/g{:$@:)j&O^fx ^<-*w}˓"31hsC͞o#ۂYw]P+B:J6IhFF)[r%H+J/cNZ r>_f77u$Dh;4v QAcr|Hg@U]7 UbYTNJ/Vݧ-n|ɍo"Gj| Њ]t917"~Όv+eB#OZ$# q(9ݕ76(HjIwY#}wMkV".Nc򟿖/1GR!Z" h (gCғ Us""DfQN(% IR*XP)daZ0E,ڸFA5|^9._1ؼfŞoܼ/>d[T4[,;7_~r3 y㭷 K6s.}lK9`_yN耝X7k=$>xLUTǮ!lx̵fYR LbK"ρC+Q2=Mnf@1kHm[T08s<,_?TrY^B;k 4I8# Z1FQ+ҪP 5ig yv#fUǣryw0n_j2X1M+Ғф^{촁{s55_WR)ۛ*+<~5[\H%9 3PofY{ǯ"[3sҜ y:H/xcU|=)r,\Ĭ 0|qI0G= |&$鏥΋ xkIƯ=KGҸ#@lʕЙw˳}-hdzvFYy鋠sJ1uI*$rBYsN< 12^iԇJ!kK>xuV3 ](N^SZ\ԂgԥȕQTQh?9Oգ(u~gG1iNEX1(޵q$Bl;v[bo$Ko˖((NSCJPə!&ıtuU^L &wz%R<33tljl_2_hFJBKD)RH:w!]ͣ}a|t4JֱͩL؃E `2pA!c hJSX+BԤ;gqwDžED?Cŋkkѝ}֨5?.MȖxd7nV*!%ypG S}j`ر_@u޷GJ?>I 8v 틬V)Woe')H%]2TH 
X8%tEQJ!@ƅwB1 J=D!m gEYDk`0J_,X=UI;9Nj^Yl0{Ͳ+{SpFS[Wj8`9KiM_j ,🝔fjYe|OazYH2myGP*azs<80OҲ8vW]\VnX>گ\i;B%| =K'AKc-_z8쫿߾ M,E w1[ W8cdiO\%zS~:8FK+eրTQaQBc^bhJѣ,(ܙlG^j.рB}; 9Vŀ=f2Y@cjPZuY&HQJ^Maqlkx8^g@~{ѹyDXrqmHdTT5wx"/"QD/2rXޔbRu}zr )r@9pjQJfI3C:Usg ~Jz֦7> _äָW \j˕n-Y]ލ;PF+5۾pW/7#WړyC8 )lT %檿Nv)12_Beg&!w]mUď?7Ř^~O^E6cM<-#OFhr{.t?~rڷGVǮ&DeBwUdwt Alg-쌣 BwiP =@J_v-l e_ME]>r˻5GKh9I!ZXDl?EZ-6mrZaQ`ǵLj6s;5QHEL=)F7;@M2S وˉo@;nnT'xlakPhc> Xц3x T?ţf!-jv:tJFF~k^Z'Y) ]*X4 I(e!kBoyz()!~,1-~wHGv3Onsf \śGox9_,S}x)f/K33v/O8_IvĮII.i[>~_oUN`0eh˿§M*[7RПX%;>S> Y\ȵ,4"T} N+QDLYzQ /y; ENklF4AӅNoox AN%!4YB9yj~3e%5Rk "sUɅ-yBPɦ`yHY Z%!$#3_[hgIY +B AA3Sl)'(<@R2(nv^9u:QD^[ENUbL@;FU/ym۟ $/A-}6~}R$Α~8cd< : }bh"S@)x)⠀uej'T4y˘^A#L}&v4%Zʂ>KOcH9BH@N\۫j*(qJsu1$ȅJ^'Ӄx;M.Ӄ>O(byχƓW|}&?"cwqEGrϷH?CWdm?wp2s7~=N?ގF|E*&Nk?=xO ?RDapJ>8y)rۤhKx=$_3r.Mv lNPD*y**P6d{Ot6X9䚉I$>$#"J,s##L'=4;x yMA-ojal͝BcyIV+)׶v0y)6Qmn4]ݍ2d/0>&2i@GS:ɛ.Aʭgjo/@>:2K2˟G^,4 sڼǿ4;@{!| {kO/x%*5-.fTł5,qO帏 S)>Md-uϒEWRVB]7zkzG J\%Hӻk}^F!s$ 3)חG ;,ootW)L6?ԹGzH:GS*ifʂ7IL LZaLsg 5o;<_=>e{Vz ws~]0I)VUt``t'Itd_Ycp:>ֺllSplspv-ܞ P?ylw`=c^KKX0g\vIw( 7|7 - >U}O{&ѮsCdǼю]ngd}tI5]hHc &ۍ&Cnq4ڠbXCX `z.Ъ ʘÆ5,]!k$פUS` ٶٰK8[?@Xl;u!Zzx^n Hzb=),yi$ 4sau|zimz4s-8󨆉..#m i@5=-/|2*Vi-Af5O>ü&,TF*{nt,*VSfY 'eL|Bְ|Ԧ45cv ?ׁ2%DByRzs!-v 1l heF1zx ǹ/\h|6_dʎ)st#O+yrUIջ+ѹpϮ'l94duUݫ;w̿_ee2/nR+XsFNM9C LʳRUr EAւo%Dd\ YZq9k'oG7OcieJDh}d,UiŬ:6QqH# 1ϓyūӚs#gvK0i߮= ҘG:suq\HWD~]_,ԋRf`L"j],)U8! $ Q ZCcP]t>T,J;Tz"һBq\Rد~7< bĢB09FL`J(o8T;\r݁bzTg=[衉Xl&l=j &~F3 [.mF 8"h%gIy{v]A$ 2LWϡ+J akNʏ"`'d{Wy4?)V̤$h韕Z[:**+']gy#ڈ&vϪ+]!'SyuT]YVA H!) 
3U+qY@n토L^jä&PPRo=ģּSt8px@zzEB+("62~BVڂ=%t IRO+&ez98Y2W 8i tQJ~R<X srn }j4j22pv[BL +}ORzP/<`8pCyJkתlviqޛŌ 'ir'OM늦tc0eٹeMxh-ז<4ٛH/VXM6Y\rW\v7rTҿ p Urc5>r1v岭VuC0/Ea֍]aT خ9iXѴnS'e/ܓm:msҖ7o-HW&-Uy,UW/zj={Y(ZQjEA Ph~7NL$NF#1P|2aH+L+$xADH59ǶhEħϟ _b]>RM$`?=擕8MqDhrQdr`us1X}nP3wJϣ8> լܛL {hj%oB- R%NlrTHV6Yn0%[YRsg~.xu:"A2aaca ]Bl!Hv3"k70L[ό9kAFSbjX^KN8u݋a56j&fq<Ř#&ӁJ*?xdmxbM؅ T ﱐ8I`gu~6-Y]l퇟P9;V$p=x 4d}wja= ~10R~a'rFd>j>0 V gG `)҆-]Ǯ6WȩeAfF&M+25ɬ ђu\6+8?iު3Г&&ޚk-;:{eYB7OǪ3NFɓjIav!W@=*?xuV[$yz.^haIؔ Fg<<)̔\›u]'3JDw&#ϓ9I ]dwO#YH"F@b#mLط*53f.f iɅXde:#GۍEp"٣^';&'n{8pq.I#SPb=XKzpQQ}?O{Ju7<8z$5uKdl曙' #Ф3*F5sC -fJL oQ+x_fk^[[[p]խTM}^vhq~R8mUҥZ4-YRٟ}TuhsVc8m Ƿ,5렕֨"pGؘo@r+sj1hiتiJh 3ZcY FsoUso-[Roռ޲jzky]=ȽU- n--\/gqd8G/k&SsDy2Z'Ss hI*Kzj5m$.FUZ 1*qSy>lئGnnUCU>$9ܔq3VŤ9xsn֍S]#D`Ǜq)kҀߔ83N""J/^vjcYR:4P% 㭮JXxX J(ln5k#EA9$EآṽjRK-G/HYtP`k 'Vg%#vm *G#(U #Ecj=EVh4ʩVp?zqHE3<|N*{ָuL`ҰYIZN1fgj6mV}z }Fg%'8ٜ~ϽWa΂?~^yZX~6ydNd.w J8—2$TpHJI>B40깒ዿ`392b_Ð)BlYe$yx>"t!0-ǤRc)cՖ/U8! $ cS~XJDm̮=Z:fvYIp9 PլS } סK\uCPfJDl+[wY9fm~6UEOV(2E\EA"@2YJWۭv(!'y/]^699%{Xx3Mn׷7Z]DnXِ|!$U;Z2vJ!(YT97#xx@T R2βk#%IAy[(2 5OEY%Xq$wz*+^ux r@ ^sr"[8dǂR'W㉓'u|_MÆR2OR^9Z> I(q&Y~5T: (FƷp`!DǯްXD1&kc6ʺՁI0Ƭe]H$礩ea4_nQHaJH9;`U7z5y@vL  C̐s\DH".5kp$ Ak-)ɼ{ҏAm8vC4 EٚA9aMV2?8:Β\S'}Vl=a. 
qq@![42%MNl}D~[ѓsS By;J>.g.:  *p~VHx˹Cu}TϢ8p\GAQ b9eY\߆695%#iUv$r:" hm sQ{({Q0Or(+Z9u'm=xb*rRD0R*r>ð#@6+Y.99I!>޶w@J+rw 8A+xUK1>0ߛ PDmiW';A`]20Esӆ71KBļS8f34gz!x 9Up8lb+z'H & lģ=2t0ЊuclG.!n;G0dj!:y>4(ċkbQ~l!iђD6pva S1JpCǧD19LcmF?Bg@bQ]٦7>M7êNV  `|@9l͐Ƅ0yT G,ւt26MDtEKEXo0t .:j<~r1p“3#na[$"iw^ŗa^`]ݰa*~0ru:~4 񭋨P2w]'Q/THu=ؑRT{T|L‘f*&WNeCMb[ AAm"?qEѢ>u46LHg;ݱ{WB]ύo|:~wNwS=|y7ܤǝ~icfǟpy{C~c_/SyyO/Ro?H_K _&Xٳ7/^67׿<}m\˷*ު/hL6.,{F|Dś?ѝ>⒬3A7j;ws@ x.vIu3MYp~JVo_|i^*:{vdž(3?%_Ձs2k`,;5ɔp… J24Dwe~vDػ \ V{~Wn,Gn']Hn̯r?Ya {nn7XRU ._Z2sO%&GP~?}^ArfzEM}wl& 1| jQULT$yFiܑEhzHۛ_t4o~EL2!t9z."Ju frt4c''~Sw5J;H3s'ht|7g?Dc ~oU7{oWzMd6%^ڶpmz7rK07&t>ϯ' pbl%mۉb|97 '"v@$|P0?ENڪ|4&S[K_?|flj劁ɮܣc06s4w3+vh>^$,ާq>^"?MnXSDj(\*˒j^H1B-p)BA$g\3_)zP@B S08Lj} !]소z0hA9$qwjj1jٻ7$W~V;=Zmc<ыDG}dɫG5X R7UQ_DFFDF~ɚsWVQ5N1W{})jxPd .bu+"Q`USBsj)^D}_tmOO fZ;%'w†4DP~6HK2dŐQKi= NH"xX0)oB^cń16*lD FDr!7Da}zy fގ;+X46}L'8-ͺҪ3YW7~&f'oo1ɔ\# {,,xo؎@-6qon /-0CR'"R3{Wi|իw.|6s%4bNiFaZ8)%a`X-(T+ XLp`Kn.%SڱVO.ҪP @#:D)σԨ1`UcD1)hin$ 7Bkӝq g0-)T6y&g8Q4(rr&I`Z(Cja prp3:rɤHI&"M&Ey2AHb4##g[RlD XF `z. w~r$Cܾ]~~ELE&{|~ݧ'Q R_?|u1x2]x 1`t iro]+Y";3~=zM>vY'b*N7 x~x{ulK>pIIY]}SaF=zp"s_s&@&ZJE$`2,;lFrD,J M`2݀*-jSI$xq47Ŗݟ~j40NQiw[0{q"bDT޽Z REł+$Ѷ\Q1d5-#CwF K(k%n DV E2$6!'0K 6`c"nIlh$5%F%9ɮqgj8)]?L;LS\ )gP|sc 29fB ,j ]7Պ@AVZU$3UxWKbⲘKLF9cJH 9߯) -uk‚fhnfhne*a2t"DH[@pd,B ~!9^QhF+ Ma4ӎSCVgUq@-5qmKYps H$nWba,E5 ~Z#F/˛|3 q3 m@|U0Y,Y^a!,6Xu %lDQ1pk$^j* #H}xA@[LtRA ta9YX6FD<$#TZɅe!ZB0 9A@Ǵ|&¦ b>Nd*8!-w2_]ObwQN}TjVRs>+b7W[/>LDUqy #s7Fx178d 3tY">0 W>:. 
[binary data: gzip-compressed archive member var/home/core/zuul-output/logs/kubelet.log.gz — contents not representable as text]
]Z>Wds=pw9z;~y-=kQ*_Ҳ #!?Vɔ᳷$;ڍ\3> VA+ѩ}໋v;w@<南KGnm\DdqfvCljZ ڈN]!Dm&nm݆\DkdJi#9(df.Ec]h M%&g[}dFaT,k8TKs4[\O;TF倖6˴v.맸X0%h[K2"Z%S|:yG)[-mDv.vH&n[j!$2eZʔ9#MQ[ on|[?2ݥTH_AVo%xolzk5ߺw.-7 #ttOVlctr}X~<# /s((?|0ښ䮖=],+]pITޥp_ߊ7U 7= ؐqWߖ {5O*4Z>4 2.QnAT,VXa<5r=o9?\- 8s:kxa_?ӌ?y?m&jˍ$>\g bKksv_ea3̀>Z]x7y}sb M*gMdIp)3Rq"|0 gb4P!Kf\d]=S@ E~uU.m*T3ԯlK`Co4pQ/;rG􄆹|~)l^Z"!c=Q1%/`r3ΎNU㰽Yl5޳ |bATFmAPW JX˜:3ֻYZDdL1Jûd*A5=qKcK 't eBc\#.e)T A- եaNѣD :%sFYZ@X> Q2Fj-iZjRZyTtuAF"$ X1 v)U%b^WPmIz^.)LrQF*&)YNSQF`ޓ$)MSO5Grݮi*sRZ]:u\yjRm!sU%h 6KnrX xZ@Y2$6#43%Qىl2Y1C& ȬQGjK(Z*?f!d,"IYOi9pҫ)r˜&AYٟjǞTRtʒjJź%V\u0T$fBV2Xk W-r&2 TVk2 {iA mR5Id# ]'KHh^=XĄgYꫯkkrHE'{-~!b|Yx a/QAFK-0qjQ%UnI̯*̺A*&E`U0pdLq aBgZ0h hCZ;fQ сKr|dw9bb$iK ҀqH>ʀXc 1bB;!2A1BqRyD;N+scHw`8L0`+!5gJ {f"؃9^R~MUDmҘ bs5Fg0&qig!&$3!Cj ʼn h"ƫt6f"7DF$@+ %L!g{N:t_jt0=z0THAHF.Rٚ&I?Jx 3|(Ӑ [4:HD̐0htJ*lD :n19E&et eH |X4)--KzNEr"I&$t,˭'~Tnk&~ ?Y!*g ?b* t-̡*)&k%93MdO4w~wDQ:"-9lwkWf({5n۫n Of&y11}-'Z}wIX{  _`fqq蹧')w}:nSe:Z{,P5?mE;lg ǘp=Q)/vV|t7cg;NJ~3ŷzHgT!ݨM1bGXo>zF7 ,U0R;ȡgKsZjgST˲nD~}n -~jz9ҧ;tI`b AH*zueu\x%nGt'|@jS9@*}͐# Hഖu¦vt`i_wBX`hq BbIAC(@9tARλa1Tw UEs)DZ%'*1eLjF"L-]^_WO]E;IQ `J%{,%X=W66 B`: Kb)";hSm:`u(P\,\: eLq_<s.Z5֬BUF<^rz-:Es`s+,&<-*9gz}8T."' Vi:j,* q-fY) ݺnzOtD:涫4*-`Z Z>Z*Z㡬4ɜ!:̙nHMAQ?{3t+¬DA@G}e=U['{KUܲ#E27<`h2TjK*eaRҨFVJ\!.#G|wY8`p:3#u{:whW"? 
d7 )Uec#=tPoj!kND, 4;_MΤc*d^A>tR u%`)ċ^)lԵ=%EZkC2FJ)j(NF=bRf;ehmq4HC#R)Ye]fu)#O!xYqK웢>67 $GRt)zvr: o *ř[Mq{O W[,Wp; (4s}nDd* (ۊ FSFٺQ\p3( :t) `Ubʐ@ț]v9i^Qnɕ؅Eb9(XDH*;0 ecuIY<:VGh BE1VQhE,!jB[-f3۟R$wOuVv>zz7]Tb1wQvP}Ӱb,ۈQ:1W3:c([Aֺ)"[Nbԣs/X{N"niE|L\]Hgg20NJf+}oS.>~*VQQPV2 z2jruigʔG+HXI@!:5c <[=ʘѧ" >OqZpzc\,t㲠qS4z \@H=%P)Y4&JamcU7&`j80[VYnAaLb5SAKZ{#LxiN3Y2@f&L)k*4L*%}m ` )[ #j(Y޶1 &lda]޹,}6k8n_Ls4 oVX5lsX!ftm_QI}_WbSyz:k4Jk^VfQ\,#z'Cz[#3w|7&)P71Lu+x tyA0ܺ%,I ˆECVbq=zj^T]`q}S޺qn [\]-9b~gpkɯa9徯f믋*lP^o7/20R^ׯƣo^ΧCYIߐx!INQwzVi=k5{ UV_AIhGk+ .:grSuAHY G.6é,6Q< }/| Af'CzZ Oi{[ 2T28<ĉ,L  Oҕ۱k!$Wrږg?]Qx NGw<?k<xO>oszD;8c*0u$!p#D:r)("!".:gFDK.[5NdKS 7Fzt>]]ɒM7`k> 6CA&('5 $1#"DES; x9,|8C9&P#|FF N˄v7WȿG `tLC7~on Ai^OW}f={b=p$zV(nrJ'#<:8Qeƀ ii: ӌb[6 id+qq h~/i;(_lDR@r*d´4J 6%g5ܥBFY fE$-sn@޿&yэ;N3mZOzJkX.zQgd" 00y\x.V#bz@1ADTTmwF/_<SY-"ֳDIQJJ`S1̬l''& XʝIPAEGYX;S띩ݛ?sk·jL^<>I93_ߎGͿg7l fVy_$9HFQ:PV[-N-<#0ee29``k̟^zr AwYS⧋ϔ;=/(ī;.W(%;m\`ŽoFft Q%k㛥dp9]˛DVǏe换=VyS%bjS%jFn4`;"/W_@jJ &?adm-*mڞz43%/w AwǪ,͋B,t];_vfl-gu;8WOFf|G47<ᇩ QsUiOn1>U"*$Jls_N}N?#<bfhf݀b>/|<6N4D9 FPF89a9 {VF@io}c}M:S.fC!W` n]@&fkΓN,1nfM0iR !T߅e\яq[_.Xq}ܨ^mr9;5D$58mzjȏRm(rLry6^^bH Pϋ[9[gxUoQW6x5~|>ABwl$O =Nj^JRF>MVݰ.7XC@Ȼ:C$oρ6g3RT u M/cJ|n`Ax\3tr#L[n+n%FwemH03C gPfmN> ׺LD%)RY˦ö y|H$@7Vj\Uu_EV%i2m@6f/n] ʘWf,ų ^Z!pcZEL.a*JW]ݺ!FHȥ趢8pY oÛhӸ;c.?_O+O?N)o5\:$8P0A/澺 > gPT(ԉR̬2'9 ("põ#|nޱϹ;~FȂN[1 .@!fwwӻݜWa/p.yƎ_㋡;t}_EGǓ'Ĕ0!%]OS}:K5Ct@5#U8&H2\@ADG,: !T+Ԋȑd HJTDv,gz\"$ pA[MV\XjNx RTP>ň|ؤtLVф15>HҜThz$F(:7{EGS9]0AaNs E؂j8ՂJ@hb`yМ=Õ.X-p-r GcHtLs: A$̧㘕F)Y AHMWڒq(k:R|7P* WhՌ q@T?V]@TR͛(GB7 Z*ʅ%j^.-VQLʙϬ2$:9W4{au'B8NN4\q+FlT:Y5HF+ ^b^'KDDƼU!9N);a*5Պm)C)~ J)CiE5SP(ʼnUVɘ@STR͚%(G}x [էT3mtQ 'D4&jP4էTs*tQZ^+xwҲ ղ&}Di LvP8;Ҳ լY}DiYFܮK RPZQ@JҔT3H( >?~>TТ5~nmމHTR͌>Re(ejNAJk l~2#TJ*CiMtQ*iJ%!DJtj`KgbOS}:K5`K*N)Ogf\.D7J9-C)NvRH`AZ4HKjmȃ,V9<:v9OѦHASj tv5TǿzLQ v?>Mn48&"٥=|/x3Uͧ<-$oH#+Ag.e]N=$a o~y={{;|uY sTҸ-O!W}+譨duĘQɗmB6n`= bf(b{gL"QsL&33zf͹!`~!k5syu3萅PJ mnS\̦ I帏*28 )!Ҏ3ƈdRJT)"`bM "=W(l#@fhDz(l%C&H:0L4a~iQR 
?_߿(N.f}#Q)zQ>yxC]޼8bA~k~g_IAr7wh=SWj(0勿RC`PUޅ01SOO dp}>=P)AѣD?]]2cއ c̀K-ZnwKwKKa_@,'pS6_ S=&˔c&Zum$Fƺhy4ҾgڂC8,1)S88ev nԛl,_>ۈnĸ.~ur_#nBv_IbbvjG{ GɹhVF &wƗ؁sV雨|>NgAZɉFg9g}T{FxZI/sk0 k88G4_tGgnip ֍7\m:޶M胝NMb'8F;i.,`7:CFuݨ0tэTu`8FK(\훯0d(]}0A 9Jn;%/ٙa x )*uiG&e䌈@St_k)iZ gs~Z%DkIlcYRPb9}+K 7C|,d!O0(Qz9w_"Wr׆9hEc^7D5wPHGk fUu[]|q 7S/A$Nh/7ʀC' L֙/ ɗ&90|($/wV:iǷk"T!jl&ͬ&\҇񋥭-l6K=\f/BRn+^O%- 'ǭ}\|fDn En1`[^ dnwwpYJ"7K9ds!gz)K|ޗ6"kURz]3o0wVl1U0$X6yObEtTLIemQDKJwBwagm|yg/7 .hZxzJjiVd7׫KjAтթjT7:ՆQ t >V_eb)lP]x2qGglSl_^D3xG7ٚLddgGb']gUǓgs:"l ~w5ĉ ,xᝄXY;ɮ@ SدVi,QھJ[E%z+*Y9*iwܶJ[A)um3X-v%hZje<+XD6l]9h7Y`҂E$_OeN\z]aqJ\"Jg,@:C<#%166Jylj&Z|EZ *2RtE"gp#Lzڒ]횰nvF߷i4lWj}c~oq ֓0W/R^Z6A&;l_/j@7J|,Hc]k%^f!9x|)ߗ$M±fZ`*A1SFBbk)eI {)RsUwTAZt8H5QR̶R"] |C:k`wE ȵ?N3  pʹAo21Y9X%44Qb,of)yJT]1Vg\i/`^qyqyqyqye3=ɘtJ":QōR1HYTJ'ϓht])?vt{[f "ߍX0n\'<#к D)6Y;Jb3it)Wul&0(Wz qs! ,RF8kǒ"0[PdE I*jJ?,*FwJjqw%AF*[z!2-Dm-#)&q|TeHP哕Zd&E-0L8>?G-iK64[mA5 #dW LJ hp*ǵpdIŔ:('& aB=FQka΢0i!(B L|3/6d*`B9ό#^[ A3Ņ\#'m F$o=E-N`DZ@ziħڣ8-"(k87zODZ޵q,ٿBM!Z 䮍'/yZ$uccqVIiHyQHq4:]]]]'EBBeӳX"@5Z$Q,9H6\{1:UCԆ:@\>f@T푐1>%Law%5@}!lrEjJe/6ׂn96& l+f4ijsMb d3 ƒ·$!r?"c>ײ)LkFSC4USo MAS OLI8y [CڨₒZ9zK_ cGۭ9P FoBC/=3LԼ @b0k x#fZ80]59oar~@)TݶHH$kqpa3&<񹬰?xXC|lBa7sv#0hϲgO=ĵښ;!j徃Q}TO׫^j伽~so:=ZukIjzuo~}tV,eP']_CfO%0Q{VƔӸg:z{Ou~jfEƉCg~*َDPqUw%z uzUlRKbO*>Rw{Kg*K0K-եtglWjף)2/dad!e5r>'/6h|_w_s,:]1[1Oͧ`|]bc~f_ޙE#xU k]1f\`˗aF,ruVB^9D0%tW} ;n:pNwx-ti~ѻՁW Lqy7i/[\>ޭ,–ٵ;z:!)yƧH_5|k </Wweh>5p3 G)f"A%Tj7'@}T+Ykudkʵb C5u ~.mZW)8=^M8>kʪ{皲ӭ[\j<,g/b=+6U1n?!f]WUB^JVàFuw5eU"1.2~'%8G.K`SZ; $@+h_[K'~w.un]0][ y <4)8vBunh`SZn [ytĔrB5:+'syÁr2Ք 7my۷|Y"ۥEnG엛( V՗Id U6iCqamoW0ܪ ݦKs3{X?,iu?L?LI~Yu L2m?$ǿ\s=,;&]rjxR34%8Nh S"(EXBz7,+/a-09,rIwd|c玶ȝ !5xEq^׏pi6o!7EE"LDrX(?F}J~ՙՓ`$((X!ɘ 1,qT ,B"b&ʲ$fNRJ :(zOM3-3)zU  \艗( [hG[)!zӢSkWP^NFxUg[;j^Fhq(љ1bI81':<%?$3dsR(PWjV:RG"cDPQ$2!ip,T",1B Wns2EdBgdv|̬Rj}u7c 2)?!3v NeX-!/}/:ekmݜDdSD׵:>>Ua8{Sd+F#neV>C#j}BCO[-A݀\9/0U}U=06WJ3PgъWf[^m_ %}^^$n5:G1le=.l#C]acGau'L 
(Btf::[pxwL("hH|<llmZqa'viT!QqYՌ%BQӝmZӫ8=Ց$\=NۙhtV,#/bNõĴ8*NxƔ0BJCa7!L,SHsEW'sy硓9?COd#%?J65aRɫHf/aˆ K6_MR+p*] %jB^`.5HٞԢ u=CuҐ<;|zw>M%ng~x 3raޯ{M~xP@';|[wʬnᵗޭra*nw\:pNwxyi-w!;|G!Z3ߑ1#71&lөWz+>{)YM>قPw6 !Q4!FxSb5aq[̶m[AwofZ)3Yϣŷ{ni_zD٧q>>>ͭv-ZQ]S O67h`e=/Es~,₃|wsO,ܾh$* 7ݣԳ] JOnYs޸PG%JI=ە2!=mRR*/ ǤJmy4JvC)ZǤJM$JO !!{:.lWjBPz( q)&@1g{R+!4Jڃl9JI=ۓHD?QV$"G֏߭n-5)Pz(e Lq)Ž@1gRQ*'ܔ04 zX: <՝>KW߿#TID'&$ F<&4ګo(犠 5Zrjl$PHaIIFj·mӞn:PY/PQKUR$TnPEzR$t#GNnI\sP%JݒеԔaQʄJY~Ь8JpCi.5 (=i@ʼT%J 7RkE0Jr*M9#ǥI-pw(u˞*^-{ZKMVFy4 “( Ԕ!?m2RGxǥJʹrSF>CNe9koҋԳ]%Ai 7zR,PZQFiR,zRPKMKOTvJvCi.5R'RraJy/P\jKO!sS-7>s~ʁ/+o(t+`ICqV `?j[Hk/Q,^pRXǬğoS{h:zYMf Y6$ZAFaK$wn] H܅I,gwEPJd|l݈QF7~@ ߎZRYHo/̯6]ïY|X݉:eJ"i'LWbb|r;6:,|I:&e 2ߢDql8]RB'PG)w3 o;L /K`&0>)WZ^*$>Gi-r.ɂ~\ޗYsAVևi.k*ZT]}IHwh3Gaٔk )A=дKhaM2nbKŰ ii"`1ʍCxa"h,>Г[6~^;-]\ `U3Dg?DuDJA-F0e }k"-FE 1FS)34HxH d{cmRѿ&(Utɢ]mן)ŴQ Zir*0Tօ=W(x1:SL)Kd,)#$)%RhHSȘ${UJUSA0*WHɓذK$ Qp$&c(I"cTTNu&i4#HFY9T@8#XҎ SA (VK,52iA DL!R% 90!:<}|AzlXm;7q؊Gq(}X= +o69 cEbXZٱ&zJW7|LD&]~|;O2_,xsuNR$TO=9p̧R e{b:Z,]{#S„4eTèݭ-}K!-V`ЎYZw/]kBB\΢l4BfN4/aʳ*E̦u¹ۇ&exsW%>=+ig!WvW#~8ؔ@  a0<'S^,?Ѫ 3Dh~ޣfMiMގBinʶz) epEBIvTUΧffcخ=A4ϔ`K.oozz=fڠ[<*j1S9ppBoixD?oˏj*hW$+#=͒f^t7wFo7|~277 pwq{pEH$^ j?~Z_\WO_󄨓9\K28 GlQaVwKeC_5+z t]+\~] qD-2fIt.M)+@)"{U2ZJ^jD)g?j5Zr]$5Yp|]GM2zfT8at8h48j]|cƕǘQh[^YYԢΘ*W;S:PA(R6׻?KҌ'yy 05;Tȓޝ<ЍF Sߝk{  pCyՂ mףm㇋0QMVU殮8/5:D3i+VQZZ@FliQG1~1= Ά|+QН YJ!< 42UZ D :4D8XP$,tIh" j."Bs!kǜy+"pƁeU# k--Q:Gj\L#|s1=+!"ob.V53g@^״$LpUS5BTV8sR)T6i)G!;1z1Kr `=|F$x1%]?EB0 "ZyZe}.e,XP_~r29$~q=Ro ^ Q0( # S)"W 2jlꪓDҋ.DakN 2hn.P,A_u|CB`&D`%"hA@<0U5)cODL9q+u=;TJ?ʁVq[iYBl~lyg3Ncp4OaPmIQ*E[HcR NaݳDSXm]LJUF%H_&AxZ(EJPSPWiDi 0{ 5  ;FʑζD"ZDi"o`B4u; ;ۄZ3)ӂoHO҅W At5SFmW "(qUQSr16F !ˆ:=DihB' 9sͲ}IM1=A~#ŻCKTREz>,M4˦4}|m Nr11ox 7=^|<=һa!gnmJF!3UܩKdV1o8JfE#[7C>UH\<|^ޠ|=CxyylVL?͕KBZQA*wr'PJGT¦)/'%O2OA4 ٬dy T/9^X)B_Р-|G 7DVݬkdUlq1 'KʣM*ME̚*J9G.\r4ہw'[JQ*j[PeI57j^BpY :(Gbz{WWbX:g?Dcg@B*t~wP/72H\SOLI'ob:4zgo_y!)oƇ{{q\>඿x_}mw?AϢz߯nl(?s0?34 mX;|QG9+-xY+őTZi*bKI]Ie]qicUƒ|'({yyq \'j`]\ٞyx힚4R`3eAV` xg(NE, /UYt  
3S@J*6W䢖ZPMTQT"aT -jtz 94!kOS[g=h+_72$@CtB_=ϼ߹]Ym7%.0<_san?f {ד6ְ<ޟY`}?|Lgmh93u--M颬9ֆ!=̌|g;faZVŞ́\{/xֳЫKY$mrUN'LכceRՊ^P`@R1K-Օp甂sܖkAR4ͽH jo#2+WYij`OML\m9pŅJôTxĸ13` `qs-%9͉a ͹A͹ 1O C|*s`24攄.0mh$7V)c OrG{쥌0 q-<5S?Ch}9"Ggpq+Ֆ08:XBݽI4Gѵx|pt&iT OptOs~ ]T@:G gF%Ȼ >ζDE7=C}ߟIİDВQkM$ T7sGQ 6jx(MpAK]"NcQ{cV8{_`Ԛ9=bäևFjh( 9sͰ)WYubc:HnE_^HEz>,M4˦#C,b\ L'Im]6wă{Xփv))$;z~3 G:jS[S3I9g.r?uiC/aRǟ:ZAIz%N`@,M޶B#fWbbfUƽu+jAFAH5-gTB,teR͌*B9~wBDB ^6Nt >@F ~(·o=o0R4H^ujys6i8 rvi&[3:Usc%ep55R ZR*uM(VU R4ܭwvGvSnnB8xM]M$'CkK5IHg]h4h)JDmT$>t01QHarEoуhSKRiTp*f*JmiMe8hrAŎ4δ%VqxYy[F Xb1guLyMv|=\ 3+$yNyo臵{~,k_Nu/w ImFZ{R+d4͏"6O=SoIU jy҇LڝL=G|L& B 1W3Y`^v'KQ0lĚrV%ZF(s\јw&ߌZү Mry_$ڝ_~/RT7..hFjdh>'ogI# -/Iӷ8ӹ 9K3u<v'kuNTqtFx$׈o1rK>1fH@4+%4r-3誺Ԧ%74ZPxT̔P#Jв HZUPT.?s0\J rP΁pb-j-[ jUh\&PN:FtbȨ:*i94Jr'*%GCUG!ϨhO@xǜh"QṅRP7Κ͍}j+` Y>q~?,Lx l.Kp9L(!Y`R7[67nz7~ݦ Qc%·m.9 s-[soWHQv~* q% _6ɬa13.{ߡ>^O0^@M( ӅD ~g/|g 75C) n޻@s^ƒkn<qqΨgWjCuJ}9)r6PĻM={~tP}tCT7؟DI_!DF*` Xk,tCR9J+ AŌZs[ b]PRg+qpƇo|)zI7>0~Ƈ7+Rr Jp<ԡ. %qp*i"5*Ƈ汁KnKhх/~,-{,+j#T@IjZi1ʕP9iZ¹.]UU$0?iKd%C(cx׹Ez* 9oêZKʸeuI 1 (gdJ^)j)h]Py-Ė-<gHJ$)_5ӃRA pӄYBhm'urˈ{k{g}"m_[|`Oҟ 9A@קG<=&>} ksw]}Im'{ݨKSJdpt~+B!$QP~W'[==}$49&}3#K|>rZӲJ%lpB,]J ]BxuAQ)Xp:.zr'*E*|p&k1¯ hH T1o8JR¯Pw=ThOSe@ഋ \=xE#KTKtn}l BT 5+tБ %rh،JSbgN<0²]id>㷈I˶uDj h=t4s-s0 !G$|ʮ.AeJ=z/R>I|2E[I$e |t`Z<>)R9U=}~v )!@=}w)1*dKm!AVĜmG7':"gԂP{!勽F_0~HXG˦3>j)'Ge&Q< ?W=p/Fй<йH\>]!F\-\އfTc͛k|ݐYx\ bL')m )4n4I #[r&]o6WqH/]li$\pZ#%;e٦Dْ7۬,KÙCr(L:InףwAtbĻ8v6yx@B^9D+ <.IΑ9Q C-/#Jq*hTА'lIŁɼ1i9tZDtH4+!*H*u=X+%JьWˊ\ns#>wT1wsk6K=$NS5G߭_8zb>]o.짋/ 1F_KUQ=K&y6ߑ͟sRΌ]< j2}Կ(Y_[}7o!b֠υ`y19\4``-DɀDu|·#g7|C%x)99P1yFȡhR 2Vҵ]ʷ_ď? 
zluڜ.ne`Bj PgY_:c$X QfiJefnp2%i]YeoJ5r>Ys@߸dӌ7,N2āN KԞ<P-nm-bon.<+&d֯sXߌ~Nݿ.;.rg}ȻۧzOsk%O_$坵~[?uǎ=zZ |ٳfoܟBpO=bo\d߆$4}@.U[X{-m5H{-8DBCf8D1kSp4I g(D4HI-Pn$uåݐZtv')C)cDd(%mUj%3JO Pۛ0PK۪ԔVhn(=-@VdP[۪<6JC)ޘJwI}[J)3JO @P 4*5g4Ji//0 @_KMtgJe`7!ĥJ HF)0|۞PZHsGi74/Ex( ۆ]J lzF RV 4ЗR.ξQ})J@) Ԅ 8QqJ!RnH^z(%, +QJMbg"JAԘQJA41qFi4,Dy)5e,Fϫ'R²M9>zoRz=:Jy JZ\ˆ'kL ߾}ۄFafd^PtCarM#^ϽnkweЋfJVTE9Yћ7Qt}NWjRb( gkKe(VV9z#6"0 ,+nO=`M1IU_&Ɗ2 >LezQVq{U^7z@Q>ݚGG&KV%`iJ#~یS+>G{ ){}?֟};/h;.D2cL '(7uuU*q1 BA U~aLg;NP ՚?D[+qho ,k]&*ȁr,?%c~KBn9kIc78/tS ҹn,p>ݱϣq[kPJCh1q8bNÅpM4UR )!MfL)SEaE<'NJ%/5e`(& I*jU>L-Z[ R>hU@@_2Z%yjD+MSeF&)R,4KX2 %DZa.!ŊyIl?>;3N=~ -5srK&_&cɷt/?)Ŕc+0<3V`M2FHgE2"`IJ\,sY*Jn&mZPgP==ˍnvH'rPPѕ '[% 7Q c# Ȭ5ȔaJ@v*`yظ~$n6¾oKm ^VZu_ɕwVϹ!cPJq`>%oW_P/~zsAᗈ\c!g7Η̞Ks_䜶uGwG7X9p1$O/9cpcTGؙU1=.5aV<0q5Sĩoө Ru!*S8c1a0_/1F|pӨaڦh:wwF0ZPWo?7wwbV<ӵ%Ir\w!p y0O ^?m19U,2kr fNY% VĊA'|e/E#W/u0W.Ô [m# n]jRmM~irRLeEߓp?ѰM{1L=mRͥe<\`wC9Fk`77;r Ns3+d^RTװ2Mq*.zegtjQ"˵KϦMp+/2'.t)R %nJտN91:4])}GU_G:pg1 I͆*r3fPh_l@uNLCmRs&PK3 KwU˨ U{ak$n~r!:{U*ժqAOy]7녒Gvx?Z.?s} 4癏nF8]gv/3CW㮬y.BrߠkphK.zƟzug>kIR>g_jJ|f]rMf^2eSB^9D#08-U ѻ :hM=wOY n] C4 S >n|Gľws!#Gng[ym)F@ȍ㕟"Œ{8ģg1=Φ_JƑ h5M_n.~MfWj+|waPxa dv.sy4RPKM:8qfS1+G5\to~ϞA%XWrI:;Iнm,ꑛi^ӗIv63 ǪoBI(I'74 C}Fy`v] y|~}$fdE! %$(%Ud)vDM3dx/eنJ)!/%lڋF74N(L{>'h;NX1%aϖid 4ľaY@J?EkԼ@䶞y_"Rtc7/) gBb򓙐+b` " $9[n2&Ve -&3$ &C> U ᠇;6-~#Hu{*Ț 7X Sx9܈,i%JGeN`T1,Y  Z Lj@ [_t)DH NFj)Ù֮UʔLBJi%DE€tG_l,heVP0+* J"2X@ZSiBTbJM>jc3NS ZY*-%FVh+Uak=(Cŗ! 
sRќS4wBqD1krlhPNQx7lIA%|3FPNܐP,xcn)ZPoCìCM&$a$ݰ-\jtsI7,0c~U<(Z&PrLe'y6tlʽ #dgl63>A=S}| oQΎXY|l1KiY:nlt*R/`oZv lRij)У)"=D3BG3@5 ^Z?8'Z3Z6P-yЂn>ke#kq9clYY+nz1:alfQQ0_$'xNfS2p@~†ʑ}tƶcARky(P $P d\(zDvo5U`C*+PaٍUXhm9zMLTKzRKTK U!zP.HEI>xGr!5tH:!}GgS@9@+hJ:C7t:N;xb/gSS$[yP4sӻ #uDF="dkч1B;@+hS)3@cʰU]!Ot/*#:$RV&9I5 좤`Yd(%\j"K[g"JAL#J!p9AuzX̂p m.T<)(b %6t6LbK6&* 07xAMPɒ3xh39g6&9)@MRڳt%2dHZ.q˚-0g[#N,7/M-5􍛇N&>j946@Dhp7uĝd6ԚJtүaqԹF9_z8`Z (J`B'.BX0Y12L3 "(Զ;.MkT7Fl~gKe??orXN.9\T&(@ITTb3jdJ, !K|9(_ё?Ŀ~H9D[񕘋u MPHaݧm$ZmW001p6RUQbNϫAJajĜF6ŧ=I }Oy|$R٘ XU7`DK "AKea2UTKb:3,Y°ݶaX@)l}C)Hw{bg)F'ÍM9$P{df1SdZ:IcY2;cEQk;s胱34׼L_wd1}5-;Ub1r'C}9 :dˤ)S1p.5q84?i4RE_ӈf2PyDpk" Ja-l2BujTsDlSEFmJ%S%1j_X%mv(#`5mh.?uК6մiM{ .1Hqp?a5mr'HL1wOVӦ$nٻHn$W z `F#ZZޱKfCd2أTf<}^u#=mQwKA'^IFh%dV'}u^'kpfd!=hLI W}8e[q[4o I tApMI`-FXG [#_RK1bl|?I3+R;4- 5`ӂ8Å7 6 iH8e{o>h7 Y?=IӐ8rk1 ^DOF D"; aj:#  W.wިH;mTD@``H25=Nі:IR׀᮷#lRG &R!/cx?CzdMKOYGU cr' <QLާu¹sPn!)&?S7" [ B6bҏF.!)8w1j~#&XE8n] C58'UǝC–u y!"RLl@)3n,nem_~&W'$B)H/ltWW_.-ž7Udpl:y-J۱l Į[t%Xn9ƴ] :Sګ%&l[K4.T`wF7aT >EI{88tѽ!NG M2Rjk|TV,lh>C Cʹ <u]Bä\frnn+Bq4OEr v9ޟqt6M)!ltor 0Xlt,u`rpy5W}X5RsC1J') KP6+{!-Arj!2V&κtY ۽t=ٚPFկ w[V[adsH4@愀5$׀LFKD.2iaJcF4#Zä--* xf*Pa G9ܨ\0bHj70 cF 4#"BXA-YA-RW: ![iM]\{pEXJ,w,s28tF`n$q9cFhF H(=[agsu(ϖ_" e˫t#ֳ!&G\QΉv c˫JZ>ԖEalyO'V**5a{Ge޽/ը)Ѽ̄BC_{lyDB ʖ׽NR0[^Y- дKU) y)['/ ɖwtE"Ö<7fˋ:c8tpM۔8ڮ/%V#"a u&=r?s3E3 ñuŅpj@lyqޢQǖ>孟܋-/Σ\+A'6b/lyQ(`ˋObϢelyAGȖ6a4HҨo5Q 2FEzyh"imTtiC&_ zv, ɮOũ-لkBm0G Y  &!/#0Zp1XrސAWRmBgƕZ7JzȸB8DS0&?SF7rly6"KF[xٗ(cK>ʖB8D0%antM! :߈n#VH:[^B8D[cUw̖|-/ {d`'r#s|3[^ [^5ݽfま %֎Q΀|5ٜ?*7B;(A%B(ꦺQ ^._vp0_F EZ90Se,4EoT%/rq˵\Fˆd^@?<o+o~u#Ͽg7c*n ~Eq|ɲ#∻? 
?uU!4%WqHM hAZx{pH%_:>XhAtž9e #aj3Zba%WTg(dfya򤡠t)ƲO+`+VPz\ 3JO P jJt-ftFIeJQz[cR7F(e: L;kN$PtJYhR[)MQz(" Tx1;F1ϛR3bFIiJV5֓@1ϛRN Pʪ5$P0 $(=im.jCRӘöa7R#z IXU4F-//li~5rsͬ:o-AsBeO[WO_|GE˞(4t6%s5cHWxp%UډGNNDw['P&aIPiZWΉzDH+ͭ^u:O ]ޝhS\{uc\~Ͳ):_Ug7hN|RYw]|I`JV8`u%RWͻR-Z2^Х-K${ECnWGW ޞWc@ZswU0 05^4psNxtT/}8ٙ8о/A+!ꁚ1CjQG=ו{Wv9u[ ϑ+wj% "?Jy~ Osm:ޓE5F*”O?.[<K#?yd RNMlYpU*%e ]hUji@96DzԶ@ie/-ROi}e8Cz(FX\ YtPgE(S:m33/h adX.{F2˞ |gA_lS x=i :jwl 7\=)yzX:W>m2u&v!k7B93E {~ԦUXF}W[?I:_EzNRYa{/E~zpO^tZsA7n?5?m';ZI.6\vV,/T6dԧ1h|f,@ %A*^Ana|[:E3⢺X%eTRc!u{zSv+^Ͽ/; JPwC%xE[r*@Bg,2.I1ӡct!oGW\SL1>ߑZji_ [=bfwKԚLQZVd<ϲN6ŌpYf)aD ,ؐ˚KH=D(5 ['=P+r$P" ԌԻ(=A2RV]햜NLN4yr%'R7F7(=iRR0h$PJeJRfSD)0"fDMHPZI DIR8WXMXϛR `rFI0rQ͝H&R.PZI r&?i{B]=r>Uit^WCͥy[?e[?4:.ySV!$钒(n+b|!4Od$]dwDTжKcA㤽 Za]ꆸ%r8r'W7.w|v(Tzɲ.Un%S@X{O"VqȫިC{\)2|d5;obQҘ~ ?%8}g1 ` 3Ѫ'3M4E#3O+`(kv ^ɽ/(H&gy kZ&SS%+>nA!?\LTcF{,5QY)ti)M! ,JZY**%5Ydlv{#F۾'d$"S4-TQf`VT!ZJ΋Byȋ,C ˭`YeZjdEU+u/ņ&t۾_oş3EvEi+n5=k-f`ii_)|R( D)] X09RgV\%j]pɊ{.H[PDR呉-ĖlfH@ LhJ4d˙e\-AflJ93jIV*=| $_?):?>ɫ~#úBkiWKYo4EcnJЧΰBp`wՋ-ŕW7΀XX+?W.Wnv߸ۇK:`ZK!{qϨ]qN ut2&Vy])F(w^+ A0*׾yoR¥*{jgIgko_6wq쯍5'Vu2`k)/xlj×_;+cTdEnZv6r߬v@VLs #WVmV% pX14lj}pU̿]+*yQ&iM*}TF[gmbyݻy緕ѵA ڀ%a8ֻVxÖoW{L.rp;ce?4_7A3Ʉ➟ąMT=Eܓ/yz8A;q^/F'\qu(k̩bSG${Aht^8HԫxOXz3seJ-ix" 1u$GcNhL J\1oXz7 {xpYԺV] *N;igoݲxrflY%+l>,_yQ~,4_s177Ӹ ˋkVG`L}er*\s3$9Hlrat~aGs( 5N-0 ];#=: X{N^@6J 3hXm9 zQplMը6 +BCe ! 
O@k"6/P[d-Ua3"r&FژhC"T@ʭ kloN-Ix([ u 8kۍF"dDXz f&,ڞdUu[I1*)Z)ُMs- IYͻOxѭ<0U`qqʰ8w09DR RS6*"Thѭ<[>17NΜ@+C5 򄃕!Btb@ ?t١2ͧG6Ԟ)p[WlsyQb6:=kJ&k=w?Dz⠖,o͍֙z?_mלϧnyŝ_Tw?-ٴɷ_@p7G_H5}a@v!״dI~ӉSh,ٯN /)feQjވni,L_yN!fpJ&컄@Φ`t )IMh&MGȴ\|< Ja&^d:̝DTC]"$mP{9~Vno^ԱHTP^s$[4J)=2EG51Ps{Q-CH|_ӛ%^ZR.8%HN/p>hFR|b3ZArEtZEGDԪF564ym hǂ)@5zbRUM% E< tA˶$aB4/д {^>V 023IԘZՖPElM]yt+x4!;E{"{aKP+Hc\CXi+<6.iZƱI)$-`n a Ḇo_;6xP˂UIiN;h TBHP22+ZCXP0!Vf`-oL86N[uc)hR!sXV#i颅E W(i_P}F%D dٞh!2,=a.YpH"0" p:*P DagyrQ̦&^#V Μzc5^QjAh#+Ja[F}޵Z9bfQ jԣ W`h HڙuFR RS6KijV`hZsqE7vSFB S6*"(q8[)풱ξP TO3Q[p8"-<^N}vuZ9@s^(;z#upjvS:% \ZMS.1f ݧ ĭ0"ØZWGǓmPa3i]%"^d2~W1aTva "w+$1ZR\*\Ԣǿ ~"QGJ%ZsOr!q3v 07Q;D6 e4یS~:Oԗ᳞"= J(qƋ+[I+Cn.͏W_x ܜ:_=mO<*q<=[@L 1x&^P/\}ǰz';LszY+eb]--ъibsRMzҐSZmf 3hRr GFqҗGV#w|ۣ{gyԐ`f zySa֪:Q{D]|ԱzXpzSLFc4lu籃<:{=C =߅{NK;/6}HxݭWȲĘ1jJ˃(^ӺNJtVWs׻V,K1H+}GtQ54%@}0SS8z/!Y(bBwD g_j<=TC8EK8%A~ƹL+nVAu ߑ&\EPѭ8;?57NΜZ|J< ubNި]sA"<r̻5YK L \՟%'J滤5K7a%-]rpS8)O#OV\vХ5#H҈u JwQ!K^W\rtWEAEwl}*qhu} fU?-oJ V`0P'].) KTiQN ҕ@c1<1+F// }{x\K:' ҋ 녶\.dXk֋>:іgc[EcZ $ۏW nI8tNHoR9%[O>?GJCP~uC~ He/'QE/q8ڲڗ ( \WNxinPԚBgTc*`N[K8ԐCJ yWdrA+'/[}$STW9QIu]Njt#mXXb+ޠ*Qm,DTXa{'RgނZ5Vښk m1k*B]F 靷 U _#ו!Z w$K^&P:;gخ@5Z5 i}r`6ӝ%Kvw6Gxukg*8iy6uo҆Mݛ/{SoB\Wq@]\dPQ$!WP\ky|،>pyT$Z}W;{S>{jw>QXC`(YFOgc1K.8)&>egY&hkZ~oY%y#a/῰uEkimMW,8[jS='#̱P"cfR `U1zd'!$tPHҨT+2 귔["Z]?R'%TJ5HiEa<G}T_SAJ{,$Px.s軶ue8)%eG+~~/ש&TR 4NJRYH)8)-&l{.I)-Ru=,S}N5Q0D[JY]Ę L$Ka{3u^L>IƄrVRFP3qwzM ,RHkJPh'bb8~ X S#[zUDw}]ɺۇ,!9 4*1ZdH"DQU}ۅV%>,fJFg2(p}`(x,":\AV{}fhc3CT^/KU- ?H\JX>|iwdy}\~ TĬ+?6ȷ~G2D~koLg%CL;a|{t+J{{lgoZjxxF-=79V|oM'e6uof粹rӛ7ɖbߗ|G/aTi|VJ1'^nc_ޤ@imOsG|0WkV]Wg9W5U ;+x<Q-wE=~˫Uz]5e%رzNj|t ӝsq;ЫQX3[k;:zkuu_BE:1J y]AdU;5Xj#c$Xlwhn;z`ܡ5}SBBlzteu4 [FGjA5RhJ>j" =C.HDpjȥ#O#Ɏ:D쫳WakTC^aijeۢ(~lXޱLl4+jz7۲mfn'][S[f|%SE;^j nVfٖOyDrZTxB}#ii 3\=N}Sd <KBürLb [*fSX}S:%SɝNTE/(yxI ݛߴS_L]^ 3Oc'<^T~Xޛ|“F-Qb,.3v$P>ꔔ%d<D0>Nۇ puĵ r%04RQKqLt7&׷w 6Fx hQMTf(BVHHiv񎧌}HV2c^^Ee2QCY1PBZ8fRقKim`3LBl~<r]cc{ r˚a׉D u:]H:c!MF%,eԈBvc¨WdTs3Kz{gYbim"{1S#?lP\ጟy+Ҫw rT{¯ 
q\rפ[0-|UPcD83x҉@XR>Ljn*!1W}i&в.ݿ/Ï'VfdzɍuϿf5IZ(Wa\rUu,GӛL:I]X#s)5O 0.V4vʑղ79C[";H}ѲEt->ZϯzMŘ5MO17ߖyJX_LRifԎ$8^Lf8 v'4ml4Rnԇ8,V*tB_o\ *KM&L+8ɛmJ à_\yV2(7y&9"YC"'x)󂭦pXNKOc}aq(F#PFOLy^O7-b5VFD;skKa;Ē.MO6C$yt]b2k -"2)7>Ll e}nfL7gz.rm}Gwn殸?Ā4*)Ҝz7SU=PM^ ;) b˜3RXed\Hr \IG OO]Ɉ)]NЯv7PwdLNatCrr .8WZQ77 ɩ"Ny0%*g-VXc%"6 5eTjT3aŔfF7B)ڎm\; Pd^2IC9tk/(=I8b<&a- VnCG-h[딽꠯.JH۪fYAC[eYǚ`Q6p߱Jp`D!<5JV3h@*J/.s|28y;z~DWv~,~~mBeөGGoI@@Qaa.+R%~|JFZwBԟ.V 3!eQBȼp繳NYCάEa(*(-f*,RT7'FOJV+D{3*g00fR4R7tTkVuVqQBZŴ :8+$=Mk.'6ugdHiQJʽ[#;jdx RS!; 1O/v2ނfRdS2ZXIY&a<'\puW\-1mzRR'OEq `Y2n1 K?X"67w?8C 47v&J{:RsNuIF+3?qVXŽls^SI@}H|STg(ŏ=6V=q+f` ^Z 2^gHj#!HY09 svYKHs:l`Qpt͌d!>qx4`6[&&WUI8Wh*I| A9ֿs KbOkX9\8D\W᮴1hpԁt09ZPrk a9BrUDCQ(b@Ђn@،dY)A~H ZP DUrkORje51hy6[n]z3q;?SJUH )QW!%ꪚғM[ƑĖ1"^Qap:-d(˴-ll/Jɖ%3)*ʦ!UlIU^UyTUPWUUY_ OY l-UXg- 3y{Fp-/Jԗ-j+>DQi̫DS* g0H+Be'*h7&v',Vcq?]LWEmؼeDRol፪,ծTU(P*PZuPP=CDUH ;$ޡb^$סY"^ɽhnmk1avnlUv0\UȂSRYK VMG^z$˧G±K8>|iw$P%Ò<~I.yk/yonyU m {gɈ'o[ow&z6v ;K~xV2Lon}Ec%o29(m>mPI _u#HW³y OJ0%8*]2#C^ 2 6v+%3L1B)q#eI>*죞աKvZVB&yI / o 43V5YGŵw6VAZu,N)dr`tmM {k;iVV5Όw+vWqvy||h?vWLH)z{ [+H]2dlR_mWP-RHPPo`2SIL!sV8b[^QFn+ mp=.mx$J % nY[B$t7؁?8{?VwM'AB6Y4v{uɛmNv͑@Mk6UeS*e!wm+tڵ&i{_v6ƜBX?1;+lk()36ؽ >31=)vK,jٞc[n7*zĕ2A$8 "!ؑr@I{,A^Ҳ\Ȱz9_i;.x2\hUd"IITH9H#yߵ)0P#DL\( GDQ =f~^d$Rl48SZ0Tl 7^VAz|S\Eگr˯L71i6(ŵ8\z'rL:5 A帒BDc*0OR;@!Ə5 @p Fz.I≈% (N2axo)BI},!,nZ t )mZbdB|qB4|ZBr`vc5՚Σ )e3JXx\bVX-G)vn.׳r=.׳r=k\ە\T*QkcApU6JsJFc ZpHSQ@qeځ*Zq|eT!u9}V:}HBFsٯ]A6ÕX~o/5 `|HZ _cdN3~.k)a$@-:jOM+JdQNZ7 Bl;bVm(l5Y2ŽԠѤ8X$đwf3-M6*aK4E]mȃ!E{/)h'N"Jњ*=huu 5]I:E<Xĝ-c-S)ˎc`"LBۮ+tɕHJi2Jʝv R d)v(Sƫ`,Nj4GA[sr)3)q3bVݭLKoy_3qJꊼ*;\i$8 NLTfapc7szU{U7R4Nk ̖} MjZuw^zЄ }5&BʉT*yU3M48%T'SU٠ 2^>#LObJ{MB`{^x*>FEF DZ!Pmp$2FQ\4moӏ︠轖OKŰɈyn\ Pr'DV DRxBw!wgD*[P&NH2&ibw k=dtJNDHim\ydN')9ϵR U753DBޭS2iKӀB10f3w{~YgYz;x pTrcQ 8X'($ڸh= U%^.߯ʇY"Vq^C*ܠ1^%ͫrL&v6M˕U"b `Qe 9E7o riiӤ 쥵vr]R2vQ=̵_>!FlZzt.-IszCXn<>,l%]_\]pYD l :}Q?ZsO<ˍMcd#nC?t6)l׭M5T@G U N2t1h3<@h= FC+rHjP͔:bW =0b|P 
IR(ORZW11BBHnU`+l\ZG'aK4Ru2LyKgO|)GbΫ/'6!gyi|z-B4'y}@s˅u7Y~?,Avq?cAyз:?J݄DV|7@ٙbo{Y]o|sg|QiՖBzCmOOEF:v*Pp@VAԎEjA(MUOT!!\DWe  yb -4s='8}wL/Oo>"OM ~Ct;cWL9-z$دRJFt~ݧ%8ǜ^^[Jւagt3W'ݰ&BwfD|ndR #-.L*kh$-HтJTO oDl\ ͗>F_|]wn~G-Wn\oGVUqDE~KU"67 xd|Ә KH/G1DG%in΍|H_͒걣s~T`bԆ^K{<ej\A#1n/ڷ V`H Ws<ƢW[x LJ&W}U/"(5Roә;Ԉ}+%ٌH.BrESBfӲJ`ΈrA=222,nW*W|yL/+`2\ۜI +TU vr~]>r2Ω`3N`=^Ǹpoe0H!LFРp9(тq)%3|{nGPq[QE,S 5irt=)P, Xxw3P)uO,PDXy0znl {v3x9y{'.e^PS^'JpAqBљ;p4Sl,2A\8` -r}6{7R;[YN[#omscnB6Oί pOj&G ^XZVșL'y1ס :ZXSL0Nz.ۭ.y`L>?7im.^[y ww4m9"7‰imLH%_?W > 툪@nE}\F&MlBR57倝j\t6}77j>om1wO&S1;BkY\9şW$GQMHǚ⅖?w8 5kHBF#s > ڻ];k FY=ZHC&"ֱsEl[{+`` лE]~m(1^8'H 0P؛Mz06)ŞxHoGa+u׎o==M`|~o/v F^s)7N"*v^$+&v+Y E,r$4JE4LzNopyڪ\px$VW/&;bS/[XJhc҃&L*(FfĦӄTx(e[Vj,ԤC$[- /M+1:IOuϨN(;-V2,m2*?6\Ԯ E:z0=I઼}y zWVuwQ{& 1k} ⛻XJp|}݁C}݁o|?|/$:r.t {k Gf2z ѝ|hŶqٷ:sl[^ɳJŨU];3 Jd /2 (Dfit4l@t(0A8=P^: &?7!ԝ$I≈y(N [-3ΐ.%iX{~w2eRuӐ%hRI*/P#8yy:~h@ZߨD~e:qp)_6Dn#y˽1X0m)*Ǖ.h~E51J4a¹'HEdpyP\hztNnӣRz4lzѠ奪g8j6'}-&PK6h]# 9]\nCnĈSuVrbũ]_>$wT\ʆ0JB͟?>BN9. 0@ sX%[tUMyNFnbF8%×ͨ$; 3շ HNvR ԩݸMqG'ΫːF0v]w "7 b|ꍸ4N Gb&%( Um )NlO7SKLb,M(6ieB(GD^C CTp,L)^8bRU8KG7 ;v5j)0*%,3E0Jj*>B/q{ m=x<^OC-OAŌi#IfFfƅ|fvxh\vhfU3D|_6@n/R |Yvjm,+8g (vЉ*Ui)th @BIFZuu&Δ-y&ZstqB{&:R8[mp"`tҲiwW8x)%3%T0)8clx@2BUʔ1{ݗ>yTLw?&ކ?h _G ?5{a4!XNqzͅ$4/eTa3:4Q:p `"7 >gGi%ȕ.V avk>/Ӱ"II^ꤼRI{|Vo=4|:S$Iiyi:['x,3xaOIO~ D~ +>;v0%b:_zCymaOgi!)Xwz0^>hUw !}# W`$-]{%1TCӻW3qC(1|_gWZ,)d x?k!PBz5Si"Jp},<&FX7}yf3ț.~k>ޏrC_JzMo_+?$z᏿_8.bd\*:/3镒W Un qdJG de͠}yd=P'\d7b2utB [[ RB7dpZ745刐7còS4i[z\$UB1:+ #N݌U>%Kj/8>LYI&8)9r=e{qN.Q99zi8S s98팎hq$ϡi{'98@;g%{9osP L1r6sR'+!x.#K20>$w!,\[kAbpog=.2'=0a6,z$w߃WR(,:Gt Ix^ mJȇ]/ CZHvVT)#hܷɰNv؈+ٕ&szdG_:hsK g kYd_[e/~7hW4*_vSuZ.[GJ]?.UZti+ƤJ'^vZRPGek /gYP&`gTH %N9ײ!<Уldǰaw½]vLTx鄇Œ\*fpf/hW)2}YB2E{Ugur(YǵIYNRd﷭} ,ǶEɷH~v_PresdfWS{bZ W[u%I>J)EPvSϝvEa*~Fifէ{knK`egO''7` AMOBdӽ{s Re\+ArҭZSTTY &Z^`TfDH_]kpx9'i$vBjiV%' $j߻lPHf׈\r&w:Sʟ).;_UcWq[*¦]&R>EX)8j`UVB'?PSo + gL,uLT.YA1|,2gmd gjAЂ[3w"&Vd˿m+c 9N;͜t|Sa.)'fҮJO-:f DuVlݯ˩@ et:A_$eX{+Ljz;hc+1+$[2Wo0ilWWX.2la* $|!S t b}H 
o8ml{ґ4VkЗ nbpo2\x8܀aƖ[X;SMIVnn58o˔+dw ɂn y ll -fHan?PH:Ўg\Ze"hZ/Q{Ϊr h_p'w9zș苌MPdTd=YPRfF0+ڵJC,Vk/le{%F?+&Z51>n.i0ݚ, 266(0=|.3Q+[S:︟7\ݦq_$s2Ƨ!G5<嫫eJxg\?) U@Rβv % R´Uu*S}c:Bʁˠ%@_ An:~@ZpNCY%ChWHsg<2%ǘ?Ay9ƹЃLrM-32P7Ƣ7t_|ӗu P%(*6 ];crHoA,/%~hpLo^Lt"JhTRApSljz}{+c"~K# !"1$z37,PLa`y፡2`!KibhQhbd RRI|u q[DH9[z**YjVշrRd>K3CRA/7oW0ׇܢ'Wc-p~@Oj'=QX+3/2R@b Vc@Mx7 JMlb:bɊE1TuYGȦvׇ޻מ?JVCnZs^Z}NB+ݚYRM:HD۳ywKu 3#mFZfD3Q)9CQ!YA KH0&`9 S(0C!B=!VӒB貭Γaw%/Nz/Nvk$SQ<ISzv& a5"pPx YBъ:%{LD?dAɛhkS[qk8)kId‘QSh-"2Y˜ڂb:d񁛢h""IMɝzT9)7vHI;[~w9ei=D鬏SQ"0"xW}b^+_0.̗H3wHrud|ڝ; KC|Qƻ{`]?Rݶeu&EIݶ;Y`3Y*>U*E{I䧿4hFP~k|<  '5keunSzяwWWHm)p}!P/u8 N!I=^ ֕(3z)ñWR t T&UвZS5*KN"kgr fYtC}"PDP+ѯ!5DɈ̊,5#< [@|)LɮP&94LIpd)K6f1cttjݛ,aF |_*VElvK:UR0jAV\AXi9~K~{ZZ]fH@x}lmi2;g8 cXpD^ ._ӯ5mp@lMD8E^Q#|ѧصRCS^3,FZo!/liDPAji {0f)Xk(h6G6V{:rufsFZpjHVcpTWC<ڃњhP/в  7V#H#!F()W ۿUr>e4A$ t_>هd|d{Dxa$=fwg=VL$"M`1.$mE;G6`kl,,4npQ3O[T=P}yky^}Ǹ&1'͉|uOo.jiO>(D'g{}wmQ\ϧg]??e6J<.,]GܮIfO+7 5;k׽k Yn S?D^~+#I;2:#vC7-.\h>cW &wX?O7IvX1JW,RF__=f)  L2xeRp;-'mW>WߓLFT heyu֨xtqtmlk,o?˻iٰpUɯoÖUT_+?4|Q' :zuNꨡ>-7w7nn?ه@咄LI>|sEuy_O>8WaC?G 16㍑]\ s~XB$],mt?>">syMLpZ,g%rݎ7o]֞giBAA|.FY+e*|YW3+]%*Kw 3E='as;' JEԾT8鱶C5UiTf4qfDMަȋʄV糣' |ZQCu_.q_pw^0h!v^Rm #c^_5q> V@0@Vu`-jN*%53(@fc'&wD]\^Ƌ )%=Iy1.=c _J(i;Z'ҫAq"S/V>c!q/=2 2E]j]ҥ[1\؂WCʠI1JbQ7 ,ٺVEH\M5QRy %ry}0.oj?$z*{ qAsNa v@5VV*7SՎKAD09I+$xo}Ŝ72u5:ϩ¯*kCq 疂 0D^[I-| Vp & -5J5V3i@Q[ƤE+w 2>HaSO[b4`(a\]2*4R{%AYbX)Y5v($&"\"⤝ pɏx#k6OK.~uqMé?5*uȄD2a)2`; gy"+_a[6UNAG(WP[ܦ0ҥ,нKĐ-G;FFǃ@kE0[YT5(*г*xnHVU\'8ΞKʃ]"@F$F$͍^f#{+۔CCJ-4KkD OTtsLN$TvO32Ƀx4|) HӉt")k=$K|8 !~gR']D 9f^TP2U9V7-m`q>u2ei#N]^ܮn~I,O'iDx=7,+ QɪkOxK` 'yC=c~{8@E{Djo'6} ,զA}} ?\/ [ăl)|rwӳsq7t(}cssK;).9]dE$-VfϓJ| dYKsgEETpI*Iɢp8Ք{`5^W?Kl&z0\3py;Ҋ>iŻGb[Z@xǐw.E2pzK1TJ1Hoh=z_ iTD6s-VP=-%' 1H2xD|"Tm y"ڑ\\`Kdܲ;-\7l٭Mn՚vl5p֨ӴFEֲy.&1 S4A޵B}R/OB}/K`#4VM+,6"قtlfҖ0]`>#S/aKN8)b}1 [c%{<0C^{2x ,XX'0ҫexC- V(%8L>ڣro DJ=esvSzlK|-kANe"$Sq>N#!DCp+ߣ@J;%DOQ\C3r҈tSqanS̥EDG<y%-؛澒e*J12 Z:ePKj15(Qغq5F|)Mv:;nq;=QmSu)EYR,ג#y (JYևײïPEsX9T3V9`L< W,/F[ :M*knRSU[Ө%b*lwzJ8DHb$&/1 Oe( 
d"B^[-| CZ*M`\]&sH*Vmzʝj%M :@;za}b#0,T mZ"U6.y6Aؚh9 BʕiߟN=Q]eb#[2q.e։!yeoYѵ*e,C@q4Qf< $l> j:JPH($B}#KAy2рuh:Qt8 :RHwG ` P %KT #!P >8pRk](xm9 -  8[Y:!5 Cu2>(  jhV@R @p [c(;TI:$ G9fYh4|B4&^jjb99Ql5pzN2QlhJ& #8D"ߦffЫeUzƉb~̙(V|6(L'AS9I\ezVٔ331KTMG7,*&qk`rF!Zw2rVxKEl_%=<,\nir.'E7\$Ϣ[?=;w1,Lbm΢S̢_846Թʘ4ϥ A>&;,Z<75|& c6a +Ŭ/Z_qF řé@;$AopL4nz%g'7P+EՒKTZ52=3) eXD7rmID3)癌!!\DKdJy:GTA~#Gyں)Z$j1$䝋hLqnk7l9Qb":n(91r{LgL&ZrgR~KV͂HOuaH&ݯ;(^}#.C ̐+TkڮYȢQjH2b1bAͦIPXЭbܼ<ΜéFP@e84TSN2J1H>.FNÌ<~\ΖP$f8!!\DdJ҃[ڍ[;Qb":n{Q9v+ﭙƐw.%2{MAdbTQʢ="X;8yeQ ʢ$䝋`BvXs d)oI ĴKlr[<mYk}m,5AHTϠ%I;@ۃm&5QGx]/:ҸnO)qjtUB QAHZA% 2+knHٙȺLOlVZ˨%)H@D}L̬,3CE  w`Wso]]#d RLmp'Vbi=omᘫSA2Fz68\\ n ;_$L0h.]5.J*1g(悎/4`~c?>fT0x8I񎖰%4ɒF:/k҄K:^Tev%2C zJRZz~=\."Xo4!K]i- v-ʜJK*W%\ydd=aͭf.-yɴv7 7WR)KN;=}cRrRgHɋK{w5.MDח[QjaխD1zo -Ҧ\`b餀e%Û+o޻Ma~ %(SXg|NB۠ RX1O xdf@.fKn?ཌྷ X 9E;i:.dp?~SEGt9B2_/2Z]T/e&myS7(ol=JJᅝZ>ѻx"gLx0雿_?rk] LJFƃoZDe|""4s/"P0`2[aRo2V`bU$#f4}^(j RRJ i)(\JG%xuՄ{2L9jB*G˽]%AX#21㉢oy}%PƒR+` U!,)doO5(FΧWSf&O9Q(ܬmaD/)y3l| x)R5~tz(TP1Fc(v(!եPFVET|MͦI#2) jZV[e&h5Md,K=_ϷM"%&8%~t\ڴaAn9 7&_{'P=|~οia|T C#WV<J۟^ h.iaz^S|nU)*942,0oBd[g0`a+C!۾cM˳=Wν[=ߞsfc;NJF#7⨡Ԭgq*&7%; 1W:@%zli9nW@)kuNja蝆 Lvu90vJ(R.x: K©+PUȁxZ0{LL7l J%e*8Q-gWgJ.w>{ :){X0op9q3=]3RgqdV6 Y0b]QՑ2yv43 0i2Oر?68ϥn덼{)Gjc胟"aZV3*ͻw&K6=x%=;OL.-{:8(՟=....|=/o 2炰dQ}`ʉRQ"J (3hJMiasZϛ gw &$piau_t6 Ň,QMZ@Qԃe˷\A)0ы!Q}jcC{lROBF!_>dP=f/C%Q.S1^RA)h[U(Jmղs,Ն:`q ֓C{@ R;ggόw:e[6Tqܦo@HMpHu % 3 g>a(ZN8wH˗l^r8N81s g4;r 甌N89ѥnSe3[SV3p:SD`'Yea@MO-OԲTd'HV̙S-?KJA%~ʨ]JsT?JtTdRVI .|iް,4"ZN/m˨fN$ۮxi+K;^ .vOfݫeޮiw F)#?PkgQ;5ʨ@Ekih_ MxkSc EFtl{K$v`u`` FZFFYΌ ?޾0  QBH@:iE}@s¬mK2JL*'C:~Zu]Px[8D*chq`l]Y|HFasн(]Y> Ogq`;!Zv;X9̑"e \(2ǽol_Y^TQ{Hiu۴@h,1s}xWxӔ58#t9e 4j)tASmTb#k]^/026uQWv@TPqh"d6`ican+l;֣LCd<CC= 1>ֈfp=,Hߣ_7[S_;,lWӥ2͹ A Inz ]GZk3 +_Uئ1Ilebz e:, *jk atўt@3Rˡ+j6S) TU1)ۙ0|L?E(h֪ՔIҭE+XT*Lu c uM dBptiKhJ KL2p08jvtmHAC(# l'D$kE`L+k֖ !;_- m,@r*ͯLr??;4> eXvxߖ"!%JX0#i6a IE])_b;V0zz 1ڕc9 ݞS, :fL5a'})\]9?Bpנmz,k-R9#h!uZs7}8R{ 7Z/ltz/PD =+CP e{,Eez{)WꡎHXsQsq6뙤 RNʂaTAхcQ#0'S}7e{ hz*w:>e:I 
ޥ]]xD}P=dI,OE^X}.h Vc_2wWoRBXL%ˣ*S_}?+X]*Y)Lecb85U8zUXL"ȼ"録%9X,~4k6S:羌9Z:΍}عܨ.+~,]\D=%Uh 3=[sDSue56G7'@Ƕ-9bazS r-!|]05xK[]9ؿ=gW-rP37^;CW{ &No\C,OsQwȱ.~ gZ6WXl=6bÉ^|@}q:y*{.I?Y ` eW})Km}̺-g8NzQ}S8ø Α=C8;M=!9xf?q(QaE^b_,LzP qhv[{_qOߠ(@] 8(}> E ԎChCf[e-)2JtAS?h_"C/]."(aZbxӟn?BpڲFk!Q jmI5GTӡ=8և/$k|Ʊ׭GPc}GD&G)#mS[Mr>tK.A.nez'*{DwWgz͗kod0 ]$wfzb*:(:(|gTm[Df~ӋMle3NrlREWziυA3Hj\sN)S𨖸g,16[:P"1 umN`f V-I@$O@%%3JDEh eh04Jڪ||\`bzm m`p|y`ĸC,b;;h$pbZᣴY&*]!bRb1^g9O bMKpiB*uqk(Mh0arhifq\!bǂqK􎌉-}tQpJM< m9){5K2GXY!Dγ^\E,M6$XGdpi6A=-11!F{;tZ,RIFEٵh$V*k{&DM^"ShVXQ!I8| W:ҚBMZ@ GINE)Zz8q+eBC*|"Ā 2GJ:=0^b~:8&G&5xӋ!:г>:O= w;v~_gӳ߹\n-O5o.NWc-` "KopRAT 5գ#08vY> /([1J7 %#)Uv,0 ks %$6/UnωGnDQż4mM/ BG@n̻;YJ"Vyj&z)hbwT0F:K6_VHcT.x4iI>m9ɧ-'~^%`g%hG %6D}Rv*9X5i(ȇj( uYoijYF~I9XE9MwP1eO ,,2'8C7ϸE"*Z!!&b$DLHTH&v^X^j:;WyRXcAtM rG d+J[\?JzLiig;ip3y҂ Bo>yɬUYjPExkiEqhPfv ݲ c ]j8>.d-hϯj}bsrM{&+(lI3wFZ%nHCH:qT䖠+8= UDkr;BAPTމX!QEn4DMC:MØytOt:}y_>YЙ l_>Y I7ЙrSmG& z%@Phn:߾|X5Lv LH>93rw][Ҙ>HFtm"#1Ws9tj:PJw.@Pjj7=:x~E]e"#A3|@A).F-uFIZ: E_+ځEDg̗9}:L_~;_:@Q,gEDi1zR ˔Q6% XvWW+h'"L*xG+O-2\M Czdd#i)`Dl0`@;t<@MY b?v?9ҧ9~Eʰ;U@##\lč#Lju ͵qkWZѵFM޽D77Xq_%/?zֈZ Y`?5obԈC[8zFIգ:?=/۷73$@}CqLuyєݾE4]<-@bKj/%^9ԂL$Ge)-C.f%4*Bj=2h/ ={H]mݧJ{[Ϛ=eL)!&›d o-I«O S0>iEr|`@f: ΉVqT@ES.ƢbV0FI0⎡('$Qa;VU@h2ƣ5sF(sa*=e!E9W<h&f>9ldf;;l>=hgk>?f@N|;/h>e`x?_N]oq5G^(sG{NKdV|ٲ d d d %-S >8MКkLnKQ*@ٲҁe.ƜZGJ jBJnݑ*Y"t[5j0<;ZI(7;%8DcFRed m垁YMb8]@:%%H{ox/]A:% .y4U\>k7?ħ۹O =0>tƮz380-ʴǽ3k90II1⥝82TF,/3Gbp:;@бm )~Vtgp(AEuIԵIFg%J3TJ$I V6 ΘO*$ SB@ԵRnBkF=u-[0Q[670ض|#m n_,;SmOT Ʉ1A\J|럟L_Ȟ,lEG]WыhsrvwVlVkȿBtxNr[QSrw㆗f%AXPx8_L䞺W\~yiB9QVnejՇIXv+G?9;V{~1{CdEnYf'b2CkoD2w TOMwR|9T߀Lx TI}a/{MD?CyɴmL fm] cW%58 L+ޜތ('q~-B!3d=j9&xAm{vܞ^wypy#lXHNJ~"GX`(H%tT% *?~9Ξld6Da=Ղ?/:kaW(`Ȋ;Elb}aVj4פ3ЋTO,wouggן]Tڷq$sWU'~K5gvb1*"]IV''gtJ-zv[p; 4rډD4!?yw~yE߯v14~0D)ٔ[i}_"l O]jNьy_Vz/ZIIII]E$K"kcDGotZp浳DYE"*ż/ӖMI%8)R0q`\p W('5@x:7gx''^QWl^s2+nJц3%E N B"E H׿gU=BaKt}rA+F@ @h&H*r 1*ņ,#e8.*R@ݹaf1͘(Wf6DU'8->q}j?K"]jg9;k5@ )a84H-3t%: i)(kx=pW1cABD e)-W 
\%{?ݔU{be?$jnv#r-GN~%75P#Ȣ9(ekګW䡱5䄐MW$Pj,ܨd<_ɣER&o$ ,XIbn?TS;dg5gvCP-R ö ::&XF=z˃6Jfwe=rz  ez0lc ?i鞑^ig&ȺGeM*U/`bԐG^>r/ _[1>Z'yL.潿}3` ~p!+X 6.O}Mr{6eG,F]:N ^Hu NizQ};#ۃ S6.#9VF:8o!Q(%K?qLͮԭ@ap}5+9f>q)j1ry/$nt9͒˴˔lb֤"1a):iS IYmTH!r% V#7Cg)GJ!ub=}Ԥx* LH>y|us.(k (Y;;+/H$bzWeon}bV2q]b~V/r&r57y +aOur,5ɧU6FŔ\b^dp҇= dϊ$8õarۿZXïd44ꑢH@h0 ;5Q(s#\4"뼟FN c)*in)'H&{5ShYRȍ$HeX;j^"@f)/Ζsֳ%^嵞NK.aR-dp41e?.47A)@eoYr1φu C. U+7wy`/ŭZfg Q3}wȤt||P~~H=bC~~O^3"_󯋴U1?˓㧔I~][xl[_q[i?~pWY{jypsBÜG6R+B 6z3?H⢮ht]ߺr NlgxyGbubsoNjxpb܇m9?پ!Ck&cW9{﷡w!^ Gy(m q)a8B$1jGCtbئ5yKkt6Dqu wvlE [7Y-jNrjtUG w^+(Bsݰ(J?Embejv}樥v_xJQ!)*U?$E^ju@BMz^kCHg f;3kZ v^_*W yfhGpF6;3CIYOgl0)l0S)`9˚ 5Jp“,VXb.V]8tLA?1y :pA cFFydK_]8oC 'm1-D)~˲z آ1Š\a"׈N -(hoIw*YSRGK腥g*i r2zkw 8sض]UZ:C6HENt6R%3@ 9:3jڵ91kB$*GFt:A)#dzTBA|jeMtPl9pw_MU후7XrSz//G5T{OE>uZFHşGQGxb>fₔ1F`.>]xMOkc?,C]ܧR~q[8]٢"5:^7k$(ez*jf2 +46[ C-[R21f~/iT8KlЕ@1~Qy}TRrnEF8=ke+J|wͷ GUHLCީ"F͎6`غèido-+geӲ׺J+=M-9hM2~lTrqyJAIШKކA9q6GoP|.%R쏻;HȬA ^YhpMj1BB7.ǵF3q}oe]vYe{s\A#&2:z-Q5RT)V,_)B:ǵoCr\6~6q" ZxTUHV'Qi0p!%L<&(9m-@6.bঁ*n}G[ c=X YRqWXwq??W]|&,iQ]0RFӉ~%L^Z9F =*˜:z[5֫547;SoAc mxVfT ] uw̚Ii]jT5Nԍ6y}PXΧK :Fn}[rhOyL4Gl2":!Fℹ.\ӏ.uncKõ.K1uQcڑ#i߄ة-N-2ǂ*CPI̼TA[w/Ai/O-w&1 T0Q. 
[Binary data: gzip-compressed archive member `var/home/core/zuul-output/logs/kubelet.log.gz` — contents are not recoverable as text.]
B%/g,} B,IiCJm هd3P&ybpj3m/dƺ.R?)\ww>nnf7wU{e7>J#V]M!ϻ'?_6]D'[^3N̟ƫ_6,&ñHڎɾ'| ph _SAAO8R2x;<5sU矃e_\bàrP$ۮ i0yD1  W1Ch0G\RX!0ƌ^e*QV@PzEEwS5YZ`Txm)[Ѩ:V hŻμ d[Ax9g0hKKmb'(La;cM˓|qugήwoi>NVZv/h0t.@la{~imw>aHᵜ51wwzs5_zdw{K\=*f#z1T+Z 4}QCW9O|MM }C/#یOJ-¸m˭'fGa;ܙMc!-X`sggn2F*rՇҹ~zr~="af-TΙUBPb6qI"1=A 1)"϶_n#I4wq޹#V:|[?%&ZrW2$2z8?nr$c;vVRB"`-WHaɉ#R1ځ) *2+fiDNvbTc]l"Xã=<}-"Oƍ'@D'#}&G0"B1ZB]$J@(X!a"ø;8pIYT"~ȬZ!vo?Zkjl`] cɮx=7+-+VZXk)Q{lW,Zsثzf}p*5O`@ZnvcF4E[QKkMEis<|o´ =KĜ:4#sƚvŃ8EȜ:t.;dS_scdҭ0c ^X~9V[ sCdqf1cS"p9Vh*&t/JQ(91kF7^25x\8dUSmc/*d"; Q+ز wI r3V=j߉_.Za݋EZkYh`{/p zݞ:#l[t+s[>SNc0Eׅ|냩Fh\pu8UY7_)NHn 1j `Z.Z`&B*͙{D]K-TFn9fV8`{Q}cl@ w @4z{ kţ7zmBh5q377+ wNYR͙<{Γ?dҤkM !$Z=^ĮO)WueBy)BvR"F"'c@,)t\[LrR(T@ER4R)\*j% )2fe3QMX&u->ѷt $N6XBl+a IF]V4CLX\g_xJ2=褚7NzXD AA'fG_A҉6`CκsGpKze}QhEocD^l)YIuh@6^1KQVjx=BcH4kȆʰ/P e(`$+~Puy>Cձ'ڕr쟾5 UT6~ "`RvRD ਄5Zv4'h[CC{Sz9i`2iV7 f3w};SLV<#5jI@P9H S^hNٕaZX)0nGJc4/V~9]/Ujح}nЭ,mHYٹo zQ[X!7wWWon}%j44EI/H6hZf +2@$iD]6H&8_[.1ռ7yR蚠;Y}ņ/Cn_tNb $vrL3V/7^\u?-Ĺ=cN#G7Y2`Z]}Qҧ=Ćx/yQ1 faAA'NYlf~6Q@7<~WYz kj'yi/CiHL1o@Ac18c8q(ƛ ֈs~7ʼn9ʽ8aٳ#?h0j7ꅤ|gZ@S#wCu-SZ՘U?LJr{ϫKV[s^'k-\_ynbmm.̮iiNouZ}Ӷo60XLR HZ1*eX. 9!8Y\q29[ϯk6|p|zR#d@N~]' ;%h,rsn 瓳4xOC5>!LnAu8i]N:p^LB%rR9o@"d Qhق)XJ,C$C?XwD4bUW4ˎKHaR,J扴kgCr:fS$3eHJGw34fPqjPiiR7tN;){.9H"T \,U TF[9oMD._(nC4ESƙuVM4* CZAsφoe{"ćgv=ndkmrTPFZڄr5misOl>zĦ8#iĐN?{Wqb 龪d)`G%B_e q%WO5] 3]UuuUw=jr)6iR_ 6/}yDJXa\r/jg-[Lt/I 70PiaZh+U<#HY 6|$  )7ۍM rŤ6e'V!]ҏ]^qC2ӫO(ThlioNԀ\Sj@~>UA>CD!DPkQm޹pݓ:ϕXκPO 5~toj,tGM,3ۊ\gr`fakA>c}@iM-l|z'{2 *nyKP*(jV HamH[,~ftn^|k߻=-a%АV%z8N~fNƋW,K߿wB(E^%m@f;UuGBl.~e/7qۥ*j1"/"i͹{6$ 6 Ǹ$tޮALݓ{b|n 1m{ t E $$p]b&970 qiuF9#ad^tD0)hqVyo*@*iFHC$'D.hz#HyA;X 5j}P)QՒ\0@~}Mn(`@@;~*% h2A1E%a[*EB^qJG4v<HLU) Kx f{La㓔li#IYSo3,;d]0RQg( YVq` LaBeȞÉs`;ǘr=ҶcRS=(j@~ >"ZjT ǫ <i!uqn܊F2*rT2 ms2zum1oYΡ%Ĺc:[ۋ\z1 edәz UR+iYSy md}paշ L[CAUVV^!+;j#{95}:MSs4P%q^oS&MfU7O*]hw7` K%$j秝eTy 0|x<\v(*/VTU Bp*ϛX` m#- k{W?i=e:IDנkۜNeBQ"dژ@Vk7 Jtn,dc<`Ar8f4K9C md}5)E9. 
cqFynF/A0,L^̂7/¿c:@>nGc :vώ_9'ݞgܳh 0 S:axM5kN6CQCuiVY(˯"VyH4}:tr4̉A!H3يYmBսp%=fJ`+7f{F[8mg^ sMϷzNeiB˂uÜX 9DA5f' 1;C^ ?jwގO:6as{ɻ G^lXKPa/:e"C{L; 8/%^PJS~҇력-RC g$צNMӏSMw#Kn>˿lCrR 3ޔvbAq= z# I_Wr(^$?)H(Qvcm [#)cXn}1P3(3AP8 !oޝ4{⇏/0}5daVh_U`>d`if*_^$|!3O=gdƞgl&Sfe)O9-֡hNMdSSȜ.Z/ijy>Q>*3LwPՋ̧s`Wz" rړ4d*L)Sʘc7?i֪NMV^dES]*ij;yʕR:N(NVhF6աpEVA!ǃj}mMr*`VW 衆u0&OF'<2MY4(x)':5$y VsH-qVc. -陳AƸS{sk%U c*jAKg44|\S@_}^SS_HPqIz"HqCH *(A^pe#C~>]|wKR _M7.ɀ>@S 9 x0ǵ9L17q&@<G_3 Če(sCSSUEaf68RڠB:QQg) CP钚e(Q>F%xP(1# *|D/qKM U62-!e,UMz"P 0iFGc@a`|05~(C/T{6BoF/nE7˫J"hT|ݘoS%vq[EpJ գpf"xɒQ}/hpkW|I,rկ+F603?~ b-"o޿'m(nD:c|{ѹQ:]+GSiSH"ph#hD\n|7P \7 jg<3 @kjA7Ԛ1kHӁ>cK*43B_JU5yR[t!/d aܠ"dUyi4Ȍɔ3%C1FW,t 4 hAލ>J&b9BQ!uSahy_W;$6hc9Alf>oVR3Il3 CYtbӥF؎wU3=lS-a88FW E!F"* y_:Hq/U4NDt|om50k2I % S+dǐU@$ ykK>,%%ZiM3DJԇQ%;4#:;rةh(<zxj-$nx z~'")!>UT>G,˹u7Z ҉X oGl~BThgȅM\ArBYn#@\ ; ՝j4E&<'5Akgɯ޾<߂VET~[ELswLF?ft"c2zt=\ei}WDGZ +.뜇[?Ƽe&TȞ1ouAY瞁IFqC{PˣTRJT;e*KSIIVUeZR,ʫPRPSjEs\Z~ 2?(L1r#Y5opTPZ3;҆pB5CmJǠP9\. QcI;nH(ET… u[Es;CN)S͓fN\U; g7i1K3(jo G F= N .IdTj!tCᅶ)ƿQ( )(֌ ;toT/(jtnQ[S"ȉd\\`n%$_׋pYpyC@ Ϭʁ97wm_!awP}!`v7F~I i3Es,!%xpRۢ3տꮪCDž!RȁE5dYziJ+A$8T)dq;Lo?YqGӺ;ݫ/fz9Uӗiɲ/Wm_yEA;VBpjIXvԁ`)NAssi"R8X< :4?HKuj,_kPqTOy`5=w閚(t1h҂3FZzNiE?ii ںL0eZ䞤cN:bN ^3JQFߨ}l, 4Ax<~Yxh!ʼnO7K7=}̕5uxh+=ghy^,T7ߟ%dBh9V| %: X}3U$FȺyW$Cb1{|8Te&-;.$cqўwn't%3Aj̭[GƝ z` Gvk3 5{<SDi=ŖNťenAЍAe;#ZӴWYM?궼xߓ"mzm3 )޺L$j7:DI9y:2"R2N&je]Z,B{d DTI/α m PJjol؄q\SixՍCZc]-iCaf㩦c^J[`]="7o:Mnו*at޳܌?sʣ(K6ʃ`m7 pP>Fn>'  +*-D&|a˩$V~*&ݥ4 aI}ҷʓͨ`&[jR#t1p[X!9[21lfhC]Lrrc +!8q#*{rRQT'\wnaȊwr洫 ݭQ\ :U ͔P? ^54~Hǎ,W;ߗ˖8|Ll\,|$)TNŪX8= U|"P)%DU.~x֧$О0[g[d |5_ַ!YirS*Ld*D(ٜQ f# ߒp+#2%}wN⯱Oz+Ž6ΟLjzQuYod/J|yO9W9?~K2[g/JG5n\C]PZr(ZP ֕hX':B׃`0IZjڝ>؊y¢ ;|?n.ydY ?c3MSupg! 
YS֕7ǴD 70ˊ5ʉd\$ȅ_ f^'Nr{*7u7)2 Z* | &,F19(FOS5Taԟejv:fMl'p& ek 3/?z#߸4(+Ǿ| a*.,Gꮫ_a "#3ԇhidzf!)5R KFdߒ~PIl(iI,-6TU7ZZF!G5RR'S c, S JMx&Z|R ёn "6ґ0O=@$ c#{lq$WjU%!0Ҙ($b0AF/$CI$OFt@[iPGo rX()0`d9%BJ@N8kQXaH&#E ,FEV{A=4jJ0MAd6*Lk"A x)dԢ:zI).Gт  brno `k@H-@pi<1Z!mxt!A3z@9=%l"ZdR-c}PYOKS$8Lڂ1ry\31)A1"K%3P {@AѲZ@ofkȹLAJSz>5Yz2)yK~u/DOQ+Y= %LsK޾d"|k(}||,5" Dj72Ngp[m{`Db|~7$" W$Bb06׫VOD0U dwxWi#iI6ާ2bmKR~tCTA?P|٥_BS'KRJO$.9@4`Iן0"51 `H^8bdz5cZW*鹃#x1 b~&?^uKW ˏv`,M,P/&۫& 3M+%N$S]ٽX. )WwTy}yX\c*`JuDN d* NF7.cĊ3}1/Xf,#h5: jbo]KÜXG{H *Z98D HAcTgGr.j&aۿ:5j}% ( UL '="N3,ā]7<<0.F ,>ߐOn]09ִ8NvHіEc! A` RQ`'GQ9o%:ㆧUE(L{}1:1:kЋٕщα7iҚKUJK+>}Kc:G㰌7 ]Gٵ{4#.\q*ݑHbNrfZCRKU^ZꙁDWD a-_~*R9mlF΁5c}q-O@šj{3GUcsY,Ɠg~T֨<->yVIN3ϺgO~WURBTeΊDLp32?ღ=A?v8/^iq5S^ߏ8m#/ZO~q@!IoPX7X{$7&T3گ!\c2dC*]Jȕ Ӕ}Cnv\ɗgdz"f@-W,0.DLe C\@D85T"VDQ|1'>BM Sa ;xkSh|Zmdє^ # DzJh4%XDR'0Z[=4#jfpA60-yp#cy$pz\L 4lQL/h(Ũ؞U!3Ok9ӂoqQ tU>53V+F ZFa"\e%h~=wW8nT }5 Lc(COu }!$Jt*±ȹձP=*%{8}}o$JJ\Uc`43:)#((d:S4*b. {π[;{z:*&WZ0g+S:Q\@ P8K9D1)p0ʮ+{ &HW ko IP~:SN5Ǫ=|)҈󏿌Z.)֩Y,z1`pb (kd ǝA5`?>6cf0u1dY ٷDA1;]_;ˎ68z y,gS)F3bŠ2yf15.,YX ZI|1RA.&z5~y"E??C >Hd9r.bJqt動 0!B{6wgZR^6tAچK,8;ރjpu>'`sy"y7p ֺ&d&9*ծh^Q^@u A݁n!_UVH;8_)cBB#3}3fʷy.ܒaٺB<3Row~sN.2wŇ=[PmߌxwQ>ٔGM;YwǤ$1:SJEIEk'lh7ȅwf.ذ 5Izg7EؠɮzZ`R= ,N̼I)ؚY2bAQGii/;?!GEN*Ϥ[xL:0ET壕(c#yd)Y&="WIugc7䖢5y+ 3q*,-Zʃ'$XMJ ÝgF*NPM;X#cg6}CVaS8ts8:FC'V3j r |XO>kJr/f3AV΃4@A 髋KA.&#gw7Y=<@bαY˱\ү MXR:t^'aV^Kz6?SxZĖjGSvY& i:d>(,3,FRz a^aAKEL銣w߆v9hN6hD]w4W!!\Dd*:aCb'OFRVb"7W4)Faa`^(E9x&c /GA=덵 Pe!v_S h<ۨ G܁׊+< ; Ge~ c<[+xr- ݜ7l;H^5=eQB¼#a4y#4P*\B 1 u҄d4[5x:=x hLJKnHʃ$ +ʃ+yK>`UEbJ%#}ʃ+.qR0h8#2qnwX[C x< %"ZqJ|ьE-U졟P ݧGR!gj{Ƒ_ioŗrb& s;`v"$*1_Qvej$;ӉZ*S$ȪrE^w̧3*k,#<+Laʠ^''=Lw94nRۂk#3cM)elN]FARUɘCmQ \֘SˣySj(u\Juv$ \:HOOƪ<Qnb˵oz{XzX),tY)Wr J^]jQZ%)ދ?ozsxڜwl> ~5\)1#=U mqJB:ca&; 'RkUd=DA+V~Q'-Js•qI]r ۅE[\ϔ_^_RAeޚƲ4a B؆>'Žy$[I697_Cӭ5 3fZ{| umP*i5RM(XQ2ɘt^کjǛ]WItiliT~<n7pWpK2].?O~u&E1sW+1=[^zt)a6PZ6s$ ^BL @i==9|!& vO"P&H9^׀ill5a{N .{څa?]}ˀ^fׅĠ"B}FԷD5$`J8'>a)8:I%"1F[\ZZOr2@>z c@I2u%#TҬ!sgT(w) Di28=CIIl^5 9pc:LI{cj({Ɋ=BBIW] 
n:/W(EM{8#`\c[i1.%n"l ݨ:4uk"4[&kܖVɩ;6ٮցiȔQеn*t-UYDi,.@ wlÁ|;/#QJ*ƆvnZ5*SgZE@` grjjk5a[eMm-k~kwAKS&x|p;ӣxX!F gƜkl]ύbSg֞6(~\(%zޛ区?-dt{0,:z/,\ٮjW`%I7%U.=pr.yxG>$PM9VO$>#B} SCϓ@씠2Mj'^/HײtW^y7{^Shqr{~1E1.S`T(-I7Nfv򑣤aVd9p:yA_&8T+Ǫa~>lׂڧ[otu25Vw4>Д2V F'bn?ghG @ _}rs1rߥ,/%FZ*to*%P΍-J ^T 5pʹCUj,$:\S_%g8L p ̬QY)F$,6NJ\?jʈ䌨P`)8=r h20(>5jC`Q{nn`ևdR3% 9q~~|&N/ơe9:Kqc7'p^SUJ*q f9e E V4Ӓ:8msEI2t9'A0 ߵ)sSH\ i,!F61.79~ErƨF>i{׺ X_IÛ:բ'*;k_<|go|_]}~]gӷ~ J|b3=ԓy3X*jT4XehWrQYR=<3k|t׀[?7ERj!OxBt y6ȧ\r_<,̙f૘-9{YVLi,# ZpO):ԊNwnӦ!hMQSj.ƒh 0EޑI %n;aAt5f2Cާ(@{Pq>QI;\lcރ/5vydlADp)"-+04aPLL-ۭnM|SV8͇'_:l-Pu _>.Ηݿ+Ov3j"ܹ|u?9͛@V ]F|?@pyaD~nzJR tOWsgF{aZ35Sڠw1͏JMF(?M16ρ,u.`:sDFp*sTц' *j0t0+Dk|+6a~B> ^*4d;7oJ"/?MEq5Id hO!Djܤp}䕠*ư>K3PFAe/a_s^qx`k)⨅|QAt4^Æb%@`]cH31dF{KO34םoMV%$DSq فhLPs; 0Mt{)"9@H%4+,Ti9a]iUfKe! F@` `hf4?F 8*VL\jE 9C-pm5qP@F$ eQ,' 3dj zWg3dy>rp%lnkDFM_1GCP~fЊ2Qk,3rO9RZ^zO1)j~apݭZӔ.zm @`On$]j``IQ#T aw!C\g/^l{JNOy:χb]9||uŻ7{)+՟+n-s9kʠ֗Gm?\ogJolɎnY@weqH~d9x}1nL&Ҵl%oJG)f2 [v:31`͂m8K\VT%tNml4x\C) Z;VOgp0*SB^9D0h:5z˻&ބ-W)F6ݾ+ڴqqxhߩ0ѻWLqqn{7|-W)F6e!ϻ7Nn} CtS"V iLmΚNyζeQ.,'.6=| Wj||bSX-xĚ¯Qx~-1+sA0K"n*lfy:Qa$8O1 2r1T$S+\)0ߦ C#Rv-"` ZUY s4r ^5JSytHD˛J/bJ2Ԟo|-C]vk,h\Qr۝L[ۘvi~ʡlpZ;؃-7젹'{(V W7KgA+æ܈g7wS=pE\_b*Ζ^&(wW$ELS^QЏ|.B]&4^o>[%XЊ=QE^Br 0J9?Rdzh_G~/o)`{ZXc2Y>04#P_Z6ss9[-=ql9P(0?e{6L"|s c0? 
\Dڍ XHH揋>U[E6Ӝ)Ԧ &Y&Q^*#,Xt!^ɵ:2ғ>e1qMp v@8C冶K>Q¨9u-P3ҹVvGkg-+4quw\rGn1zkeQ-,7lf/)l<+~` Ԇ 8sPX3\׽`KDEqDEk($*z?aoC oîtP٦Ϛ= ٴD ksݥVs9j yBw$=˭Ȑf@lxCh Ғ/ؒZ5''A C7ГL>M$HOB^9Ds0eW-%9x\@'mthSnFnhޭrfaJwލ6 ZL$qf~-dQǣ.{O!1\>riGχRްZ7 ɵσ5a{Wm[ڑ${KrLWʑXԿGG3ړGg}%BUcCNofԏo.o߾e\/΍8brglO")oC.%~OuQ~~6G;4Tsǝ*;dghj݃GSXUc!rQ( pU$2%0U/4!4붐"R,@2D/۟r努,|`m͓6F){9HBR=TqYu=fxL)/?"cǣ>T_Bd#ȉ$ĘgYh+aD!+f<LgV{b)o,k`h>{e ]H$=ql9YHi1Ɨqa`a%X>4ߴW"DA,^bp7ڿ9UJɚ0Rްb{]\u)PRmމ?_=\\q (__ml'0fp|:Q>M_:vP]!-N4Vѫ2 ;8j@ޱm(9eU2RsV)ڊQ =Jҥ q2b]%ZKoS .J X,Wg.}j0c:fIx}2zVڦst&i7X\sj>]_?hdgҳ:/jWZ䩲,#Q{0Yl.]_g~H-,"aFu@n+ EbRi7/`j#C8Ac#ݵDD|ɭ0NH4" C-Ym `K 1b5wI{.[j{*n' !3YcrJMz5"giQ%Z\UOw~:PRP<ߎ1J}OœH#WVvS 2A% cK.y Rm!UtjB\#oCv!Yg ؐZd8:H6-J[M7JBlQIlQB^9D0emljz\@'mt{/)A4_ @+hqǹoz\@'mۄQd$z>!Uddv|Yv7V*3,e"Jj+N\t}.> ج#|Xq}U[yt}qs&k"GͰpC#4W^.3,"Qލi/ڴfLކ24% ι$snֻsE-1u^EqͷCf[=٫Dk]xi"ޗ[j`jO&pbT{/h^|uv= :b0=Ӵ)$VE Ã#auDbr0 ôRjԧ1`JC)W1ؔi'7Ju"JjzhRn, 0qRgJZBN#! #?4ֳ ڱಟ 3asKشZus w(ɗ3@!/6žVcF: >=K}%u3:(QjQjx3spOy{}QRCq×xxٻFr$W vfSE< SwbӍY <ޒ-$Ww)N˒T۪ڇ8>FZMgHg>8#/ B"y/ ;RBCP5W\2"4xHsIyF FL<S2혖KX;Ix2AeBՑXgd5-cZ"{T3(q/{*f==38_F_gjgcw?-w *LH~J1+'k#3nKCmQs>7N0P#4Fz0RhKF >X$BY5RKf$5R F!#eT3RivJ dIi[9GE(YzECmw0 K#TV20;E6[/a%󺰆B~.vP_َ㫫ٷ%Դ7m 2T7ʂey"?]Ou>vms,q=<>镚Hn&p1҅&–wrtUD ]qoY 3K z]rVv55fBOE{VNcWfw/~)/Y5x Jnѝ=YmaG8RFIn=٫э*t<͋B}7=zH'qՓ]j\s@8"2z9i/ K`CVr/9PY(#9E éQA r(! Ju [q]=]R$M:K.2DM_ n1@ϻ `045Q6PbGI#1IF_,&O!NBPΫO&)].O}v4~>w5-I ɤduh[#,hQ0%%FJ<)FUsPj'h ޮD ^FP|[SJf#WDOJ0 Y,FO4ňܤo-wh+n8_ ;򃼷¡EAE)ɍߌ'9hIƳ?g;K?ȧjcFRu?ΗaS:σqֿJ!tv4V#|=e}e%̓gu|vԊ|ӿ^CM:_Hx!ZZj) jf}Le\Fa5ip0{Wm% ldx=V%]\_b"yQ\N.\ xG,M LL٣'icУ;tzQ}ߖKR^9QpGnoH+ypEK+6%ǵUDuL*~}:cpjߺw P"G KI8'JFN4V ui$;af zq%цRތ!wop=F]wis̈́fܧŗ2i. 
UT\VER4oQ[".Y-FSQLߨJR="_r4>-Rˊ%Tjg3Icǭ޶IOX+c;Lx`TV#ب~[~A0(zk Dpx mQQzȅvZ% b)$F[f!I*/ZA6%׎]fC7ɠt]<0.w?VvbGaU3"?fD~̈Db7W,2JUL{@#S!F/F=}妈&*c`#ss-sC_W\.XT]ꏗDu]r- UT̮8Ȑ8Cbp#@\.ǭB)P 5k4p$!H8b8tfI\PNyet׸>y5gyBFxKFCRm2wy_l(wu17BccwW&ʇhnG*\D/ێޟ4 ҤN1Z @,0bQsMd/&˝b"2OlC>F ;ǻ?Y(/`wm#_aKrGjUM~ky$IOI@ $ӻez3P$23J'0˗dF%K9ra{YWZbINT{81i7%{=/MN:D3Og|JN["@xKp?[ڠ0ZlZ?Y.tg` gg^Bcq8{A!j:j ̀qڅLGh(tVxA(BĦ{l~deBqx)Kqq2qyTH*Q$QB$i Ub#Jc)1Y <{aY(1?1Hd%F<4VD&N,45dYr58&ꑌ/T9h?]$2$Qj)˹R{}AG[˦ѵ] R^+Ӄae*Tz3K,$|oͫ<a$Vܯi'jH#DFbx?!k~=;_,׊C 3Qp^$c0#p05/kP=? q-cx,)EL"E5>Xvbea[kqpd RKDc[6}49݊ +\Վb4%8L%X!{#>X Y41CVP&)29M*!rX& ZE4PR,ҩ-Ȟ.%XSb#ʓ*b%^@m$Id4?E >@(ad43ESc>< o5kT|̰`rXܰvޖjY Ѐ(ղ+ٰ#1%jnX ;Nƣc gY -TAGylra W^7YLƕr3>J2^ kW~X\ >fBQ~s`Blc͢k)Gch`VcUFG`x&*E)U8vq01LX ($פhz#\%AjTHYrB`Q4>>JcdAxcg'BaϭyvR5¼(R xz {&.lE?'B[6(#e M)ϼS3V,ί*N:4aNz&vua('b3N WϖW7޴Tя^ov=VAOrqxm >*NI¶ufyxSaZ!,S!SKeavol{bɫشwyՏ5}^hILZG ]\ӝXNcn,Rh{,C%u!MHD xpSNL_tO&P̄z23s_5I;~FPdAUgUũN?Bgz<= -byUzQn; _[7SQjSM!'?(f>LQ^7\KޯC9.h@rGqs[n4/ܵlO7|~5UR봀t5/VOSa_Qsm>0p_VjtcEХޗv:j&$ *$C>n?v 偋脾Gp۞:j&$2ʼnD(h}9az+=|9xؑg,?_ /t]_}a7NwN1גHUnpA>UW u":w.C vPuBAj\zh77`"TOC#Dq".[O/`!l  nCK}c{ꡛZ;9=YZj]a}mz?9d4zH9P2I Y[]Z $%Oh}>ly4t`rjw뼥T0R.ݚ@B- o:R=Y }+x2P1~yxNf^֧x]3\>2/xcTb*&]4\6՜bɾ#9Ea(\WN&bp1^݄""S#&R}nbJI4TEґ(:&4E, e!@zxGd*vH> |X)_C,)= |O^JYtKm8/O5wFHf?㰻} eojHQfv*kChjHf4;i!2J`E!&KzW\Mwd=7اL6hmLffDK!w,n;6v?W;l Pk;d6[)0̗ۇr|XU\71z S}sH5 YK)fnR䪘V8ݗn/ ϶3lhp3%m'kφ#&q{MUv3"1fSW[iM]7fVf6̞Ѐhċ4ׇT@aj69yof*ms3n ËװQfaw_Ӌ ԡ]~>B슱}%٧Q4OHY$'fifSsCElNDVi~fsup)XPnu`7~~lˇ?^ &H'w{8G~^J ^ !p-<"(KĈ!4f$UP`<:}\a=|#w6ݹah P(&X~'~6P;?X IKrqCpQ6Ɇ3iζ Y(L\嬡8x@qd>x "`pCx\i7n@~$E"`ʔL$(ɣ4g,F8Rh$FRMP4Fw")YN8k #)4J$)0*XD`qu?9 ~`fHBoul}6)pѶ{O]3~t}U~3puwd:0iCKP1x)>r|>rR5BYU2DRuP *V)| @Xe,#KN"^E4#*ًD篒y[p1=YC.=:!쎤IF"ȿOקO%Hىn;=FXN!môsl0ttϻn)<wfsލn[ |\'v6b0Zt/n)<wѮŐ=jUH΁w}%/*["sRN- Yx1RbTH?/y)v(%cRK,l/EȝYx1R\)-^z^J3>3>vҔ^J3~+5 ^kGkTρw}K/K:fa8%Ij$ew KۖBϘS|b[؄D$h^J~-;(Ř\i=Еo6ׂXmhn7eś5%C;p&ѦeMncLkN2AXZx]Z~zt0 Qm?65OzCϏ!AFF5Ù oOcDv]ZG mZtToɋGq{i}XiՆSst:ͺIoyX7y;бE) 
=WI(Zߞ~Fo'fV>~9\!gwE^lpzrn\8D_^0YӮlsb#,pBؾUH|̊˯1"+!9"9~٦kJVk_ۅ?/ #34~ғahk[HP. e|Uɲ7ec| 6og~CT[i|?Z]>ъd chx'qCH-o@2EBAiH'`pRIҗ/1$K8D/ pKKaӀ= 8 }phܯ/>b5t/`v卅˾:>-{UM@"LVLrʹ똪! eQ2#k՘Bĝ14)\ILN?<T,`rD u X!a(B̋&l,1)n]5|gLi=E oO6D^%L?0GXV ܾO1.ZyGWAD^j yБ bUZ{Z*gHR:S:/{l^@?/\ُ^u)w}?H~^-SxJ/EVj6YY< bبĸcRHmZT^3{ek9/5t9/D/KJ\)RaZzIa:{y1V@hşanW=Ԟʺ%):C]|i%mp=ՐzDҰsI¯"*Ādl3✣Tj8C-r1h*c͒ +J3q:Ǐůg/^su)tZ^xnDA[LGDK32GO 1KάeOv/,G'cZ V=+*;͸#~h40 ӲHY)< YZ/+K=* <Ձk6k#':Hdtm9nI(隷gJ;ݴ Z=?ܗYa7{bc?c7o(fvR(#Ģa5L9bYdIUX8A UK_>xg 3j+&v7 2q囝*hҎy چ*zx#DX$`α}HHl@bi&DG_*kaRu%( /d-*/U4NUEeеX4Cc,j|Hpb34S EvYց[b7 <ܓE}ڱ)>m_X6rB`l}d7$")y-UɃBa'O9i+̃L`w z-OYOlnL[\.(eCC/?4 ƮT8gOњJ;9[ &8}t- ^~kMaÃW##pۻ#Q?y_1haj`K[lN8%!yBUBzCXmq ۮA-0f}yiPIL\_:y3 \QVS@sv2(/.F99RKD#)$ۑ:0$+<_#CIYV>gt+|Ɗjӹ &&;_Efr 9ThɌ,Vj6q9yҠ򸆙Bk4_JU$"J UCYQ)u)*!My߰tf>t [1DtBmMU"ERϒ "2#./ bLJ@ׅp%P'oZ)^ 3ΪxT.|ZeALOG8bJ{Y?'@J* ҹ]xҚhBKS}>xQ*)el((5QXRP܆hNT|YY @~jhPu/ߖ M\,F(j!FK^ESUU )5)$ YbEheʃEN kk$F8gLk`_އp)ƌ@ߚF:I`,OD`ȄoMc&q}hdgjftPMEZ}aVq֧j rEkGocRkPE{|= gǥK JK/KRb^J/K[[ƂK/KyߧcRHmzݫ/8/^*Wv;+9$',0$c`vETPTE Yad]uEf! bjQmhwH׋K!")gyݞKBxeVIr?G E!j5b:ԤӢ\0n%alې WfV'\6P Ra87@"84˽UENܿCAAA }; `Q,05OiDaʲRaMnD!Gr({gaFRVatHۙ">̇Q4$r sDO%֥ VHԅ,kkD^i,7'YSqqU֌bp f} -bdV 8bhqjp&ї=!$hل 5S=Vȸҏ? 
\y_"Eĉ1$$f>@C˫8{rİ7W*aaTݷV /%(i22Fՙ)#S1XiދEZ[j1Jd \ßG?z~ɛ>ݗԦ x`+Gi-\>t0S+l`8UHKKl .(p.YbpO%ɿOפWr\u:l;CS y^M7EØd9ҺPj]jUs,MU5&M )MSX{>=UmD֮rz ۉD %rk\D- $*W%eQJ%T ktєHrر-xVuH[3oocLZukp@~3408}aSn=lkElნ[X{Zrȸ>o^pX9̖m͌7pD׳g]oӣø_#%tuul3:hPKA<s21\A0#t[^9z{rEhJhr^7pPIW<%5%BTKSWRhSܰ=Qփ ;aI,E)GNrS=\ւy(x/wKM_eS휒(aƃt~M*gC-a0sV"V!:(GDc Z2`G]WGzGBz {]rZ|_!2)QovBb:٧/]11vԝ=0BC؆ZlhC %XTeęϦAmCt~S⑰ ߸T ޮ`&bj'n*2Ia{,b}2K$6᪩')r&lFGa^BFfk!U 4n~fw.'ٹnm/`em2yk =x^l__86^J Et_+:Cc{8A,dqiA)܄)Ę1&BVI‰"D=C-vBV JO4 {cx!ljラ7 Y,p_dҧӝCAp[a ^p(pp PJ.1KD[y1lT3ay>D\Byx J,=7H*UgKtfĕ X4 0NN4)hzL4]?h(YIKmdi>HSMɒlǩu@S3B$;Zq.VGrf.@R017ݞT \FYnm55K8tj$`ݠ)VO.| naaһ,XZkC𤷒1鉑5MA84bAQ3.-;D7.A@83,niYQ]dܾ>+x:o\;Wυ(V-5SjմV@DCtub&LwH{RLJq>k-_O/5.Cњ UXژ`*`qãqHg.o N$$.?32(X>3_= ҩk흏WX.AYIANS5;;/mكAl˻olmn?y;G=[\;ieK:@߿+.7oӡfoM~qckUh;wZz\o2N&nr|B! 񄬿ZoۊږMN_Tp8|LC5N{x v|Er v~p\HA~ Bmᤄr;/;x ^A_>l@P$t F:8OyM z^,qBb'SaW18 IY%# L뷦~h;T#S&wq{\?|usf} o;dd3A1;EB851uu_5 e`6E]Z0bkY3+m(iL͓((Q1 [ j]%F;952Zs]K!jRi*f$5JpAd[4]5\0#()w c1*U8aihZH*]d32Y %iD3,4 #z0IVߞ!"nfw2Ai;"Bn\S͆}gsvK?wھ/WRZLl4q>+.1 )qfRhN%_~(ꏙ7f~JrHLyCPn9Sݷ3{E1CZ5sL) )UM/A@Bӫrn#{˅E"gڴd% Ty3/#%xKˮcdlSpję&{An4^;pJ!=!Ƿmgl| rƪ !RڶumhM6@וqG!*8v:`qfH il%RW@+Pn+Z$FqٶtA[l/y[@<ϙM,ieт{nRƝMkUTJbS3¤%bʪi[R*'Ơ8z5Ɖ@=ߋ9N̊}JQ"L>F$L̀i)d'ŚDSwxŤ5<5<3,,)>i\! 
Mk8N_2W\cH\@c"Grܓ5R91̸ޤ z8qX`| SO SOgGo7FI匳jrm̼q8 yQ83rF!*;/cJșx1VM:nF֚ժL*PٳƓI¾N͛L9h D,TwHV Kdp,mAMv ^+y\F g\F%:Oĭ%x5ln2ϫ3"4 %֬#5[_P4ܩ&X n([vާ;~IkHJ4PO Ԟ!ĚラJ%B>`ww)v[wC`כ$:IVcٍzl=L%S1n#D$︕6`,HR _`|9c;+:w.Vʯ܍RT h`5 cn{:S5Q9}Lk3fU2j*ՐuSeʥ;~ڈiE ӣM}uB]@/&WeMI(&g4^3fZ{]/6爺]!?M,Kv^*W_wĘj *~.Ճ<zxe̙Kg{b< P=~t~o5  7_Jx X*]dE i/% ZMgv Cp U6bHUbofRTb\NX]H4V?BRrԙ^rGM6/qe@\s2k)J+^l.BL˞:N52/4)D]ZL/\>xJ7HrגBƏgrJR1vEu$lċQhsɈD@^@3 蒓_l$NhxtrP*/?Znw:#&??>;2YQ?z^ۇ_],m\e~6iOګݼvV2 BdRI)F)mk%TJC#mYKiXCYE#ŏȿM>ݮd4[&Cz7iN){5]6'OoɼZx9v}p0ڃ@q̵%j?WqjV5+!n2-ʩJkʁÔ71V֖P+-{Z*FZa'jtxx_Z<{q{!@Qo%׾ q0Ӡ uxpP'c3348-*iŅ_2.-^G^)P& /I_`NpZn쿽~b2nW}JUvxfQ")߬Ǯ|z 4e (#PBJ>Z%l>4:=!Q}QXTd{߹3!bibܵ^53?=/uؾ'8_xJdu&۷4jaG%ۨ1(R>}S=.(ta՘79D@y+6N ׷^7~r HC W^Z?)mʿ=Zyiݍ*/E (#ɋqzQ||";rMBa㢥 O_l/P\vv;?\j6tL>zLA9R=9@Fg5 Gq";%sξ:q"|넏^k)NB⯞۶ 8M6@וqa7G!*  QO-j7M+>Tȸ iH%/ePJvQFW.?j\@*H7QZSWҭTZ&-QPVMܒ VR֨犓m .e v8QR H]!@񺭠Fhx6Ze3)J@բ@0hk)k!TYn%D8qe@&JbUc㑟x߶3&*协xKF3OvU .j{}{ݬ+ЊG2ͻˀ?}Vp`6kҽ/onb3b)x.CXuo;f{L.12@>6}Kf+攂w'͇Q݆ d m#FϚqr&i nc̐W>EͩYTvN3j2:Cۈn؛`EH{y<;= 'GJQ<"Dv΍$\Scב̈"U1J˒)'NTItt]I#')f*pI›T6_ M*5=\j M(M6J `M*/JGO54p N)AiY7J)!2RFPI(H"˩WΙfPUΝj$@X4Ԭ` 'z#b^5Q,\@!5$(m9 ZEʼnV3(#Z}T%mX ޵1zV\lS8oov*qg?m>ŸĻN_$NW\Mɺv۪o"Hdq$,'w!R'Ns[$˩m*/Nǃte{9z;%Qyow+\tWnL)SA>M̤Iry*-²¨wv2Ń:x8nۗ (cD@d%o!mC(*:6LCS-V߷Muf'WGB@3Ot?||,f! 
`*Uv_O QC@9-@]aHNN,7-pZ;J2e4kbr8^ g̴#f V8Iy=ga 0Z&0(P\SЉHSd$)hn(mh' KSHߔ1ȳX +8juBK)I6M$*tn!Mu7SypQɾꯖ/ߗ OOof^:?/'KFƧ> D1}_>⛅-'^}2Vo&|q\cwfF B R={yWnClr/nyfR:k#p6],X۲ChI).&4}4׊d>@:G LNmjњHm9eͥMkcgW$tO?//,2ZM}s(`sj׽`jէ{G^)cENھL `KDcBPkz)'&t,i#득'jЩޜ 4Z:-tWGZ _g\3fھވ47֮r΅ %w9[nupZN sZTtKLZm;4NDWb7jZXJ?۞w5،ekDoΎ57 {- AHLx I Hjr@hFY|-$t&ݡtPhXvL]zZcZCS'xY2^}^afk^SBm|O#9"u*Qn#9@!#n&&FWxOl%:] 97KpPI$rqc'ʹ7̟$?5q!]r(^͋t듫vq1n['޳ikz[Le#N>wc׫d٣%sƫF ijTodJ Ŕ%FPʸx}Z؟õCbYlb!TV JYqxCk1٬Mi7 daD6mX6ظ=JɄ`7Ra; bKҀs^2F#7|41qT>FkC OhqMr4A޵Ю˅kbM#1ќ%#Nv8t|P8tx֋ P<8 LX-{#"VҗgLJpuraZcmGA[lP8nFE!LE[GlS,SM+b.mvl,dS 7Ag!U^.mAR5֮r)XmFwcܲ~3yAl_е՛e?m*9-E-^Ħ *1Րnd xƥdw=F֣'=YNT`+\7P[tAT> cA$y$Zp$r3ר׾R#Qw4q1˧c xz0Am搾y v->ؗNk怰jz">;-F@vZ k錫&I>J*Ӂ]|f4ҧӹ}8< bwLRy{ -Cs 'Qq/B<p$} dXWn [=DBT˖Q0puܺH5ǍPE,J !4IP2؜i*H!eWZ`2$Q$Hd2/vR Т0#)H7gQ&L!K@z;e\gy9AHXRHTiEZXT32fOo:\ B}^CqLHvܵK"Lf%R7*"M}'FI a+LYbɖ)gEG> ogj 嶗dĒv(#=:vƎRɯݸxovywtfy1ӛ,h'ɤ )#8ZńT8+Cz̮Ηs{h/f?-N✧)4uaT OB6ML+aUaVkF+^!GE]o 7=v\{}ѯO.x^|_#rtv??HMJT$\b:O'lۅCOuv<[e_}rΊgRB6Ő[*<?4}s ޗBhT,U[f |patrS{\K Kga5zVCt·h1 V&r;wkj1mcZ\ػ5OE nmw>D)Fqz7%qޭ NwDywZ׹wk\=Z!|n)8 ujuu~_976nr*NQdj]<;,ίH!`|mUjƄ[ed5 _%ʱGα t^RahQkw$!4p#aSޕvVa#P*E=r 5T=SoA\<՝𷋯UTTA)A,J޳:wU+c{gH TcjP׼&ÌA7:PX=@nme2CM42Gy M"4Lg9RY"U "qV'jheCqX6T~ ,1I)eй,Q Cip$Ͳ4+@hcYc@vNn Kr֤"[M 8h|}h3yif\n!9KXiӌ; ܹh1!1e M~KhY*|dž2ІrI(ӎ;LP?}17d y3]JƓ)%3F$V-e.s.,/,jaA Yo A,mUҴDߥ V!L3LrB 0;0mt#9l1Cff 續Vim*28#T'!e(_@h wf#0i$EGHiݺ _v1 ӌ X+Nrt@~'h|w?nrĢ.fn}1yN*\0{\=@7/7~l`e?tj/Ⱥ/ bzF{J2cY+JAy4pTiַ*r C=Lߛ"IH!/ "WyJ'<"CBN}]oٿZOu|'nj7п;|}k7\M0!G%j" HŶ7́| gəAWaA \*ECocdCP*o; T8hmrFE5_6xq|sJ*S\ ztHZIcjfDQ 7[)!h4ЍnE2T ,K 6N1 ,#DHqA'w K봠-4T*paB(P)/pAB8 !nk}9@[;n/V9i;@|j6F+Z|"Rip&vt bb+g;w>$OzrTjx ji Sn vayH=(t ļ55I`2֧Bx 퍋$N{IBryc- $\rF;o!ӸJ^=z_Ѭw b|߶\H~˘lBJOL2.բnՀ6g;ͩ>L5:qK]fm;-l>9a!wjC(a At^emVyNfVbm(&Ar]~7:.B#/,ZuXͯΆbҠ\ rUa6S A `  2"O|%r\ 94a Dfj1Cx/gK?.3!0KQіaGj>ӡVJ?-6ݕfʍ3D1=hlA|k pFa(O hp RYfjtچ:T.qr ru"7n,Q4s:Ѵ3Rgn "ƫ!3R5fR:0׳МA`uyI9-Gr#!Uo%@V2r(.rΉAZqV-!iQHYV,(ifޑnfY)!Li#"UD~WSӊ˝@tNy/E͒^;LhZɬЅTڿ@DJT3r*qd 䃢^k>2_ 
~\l)o[=v]f2WWߗޫ;ܿ~%&u144# hPԆ ou&EaWQ( <9) I)F(Lwf( c~)=Æ] EB>tX`A*9H+B(H^hQ&YY0)N sB%I pݟ!@ DP.gR|%dC0C`K!!l+XJop(78Ō!DFp4i>iv) 2^wK13߷7 kZf g! d\+|g`o>g ݻfQ6֬rpY nef^yR & U2ز #@B )~Y48] f3_w=W+^8vӒwsx?T(~%nb@E"{_FIew=wPnj0ކ_?mAaF SKQQߣ-2Ai> m'ntC%(V)T"Å6.IR&R,O f(LQ.rm3$dA |CFR{}X"{}@7-%pH!wBKI-5bw+[Kc~ezKj1&^KXKi'/O; Zz\mҫRܴkK a[jLjq^b䦥|nh1&K\K vRKָZzLm1iz-F-u=H -u=\'Q׬-tKsVK$R?lK)+R쨥x(Y:iSWM=r *t jcС(!fR--ePX(&n~ϭWfahRC3VGNڻ֝.]D:q[veAKhIO ́9Xoۅo<Ç.] SRK8w5||5z::r5!pԥj×aU~5Tc %bD:|t QN; 'ttDB=I*D&@N͸ hD}'wgO-"!g Dmb) Q6 Mp$P=#i AF6l ʕ g2Ρ)BӂrXh!y7f6)A6"բޣ$cmv2Vb&%~ؖXB^s"5iR l\7a;z,_gy0 a>à?>u-/~6ڲX>MbJ).C 5HCإUaL5ˏ$=Iegۺ\M[A#F|3)̚r2oV{4 _&M}0EɋYV3f|ge #vh}.kP̧OijZ7wOO FKz'V=%RZ@ɧ4UU&k%PZ'|؞"Tjv.!ާJt{+X+"f+> iiG'^eNT>U:?_XP־;M(4 !9,4{~Q͞ļc+?,3cYGbOB@ aޒфHD ye$M)"9#wk9H.n ﹘6bMMplb.r'5k4KMgLσ7(=v&$M#N]qLqy=|Mb4NmI~8pY`,֧?ټ&~O:/ "x !#.[KO1'|ID'/b1(iE7&_b'/oDNO޷);H%`E>2 eh;L> }a > eKd0(c} wZ>(u˸ W@ZuXv,CPǘ8YSKwmN:gމ '>tS7&޹vwڡL@>2BSۡ ͮ>X2Li>mD\ Lp[ LG}lXv0@Ǩvvr"Qh> a}c7˺ʺl J  "L\eд+](͈aNPd*9ݩ/K a B A$'iN?5^g*DtN2샥BBVo9ǟgmݠI- wmJ~YUŀf`AS}a:+˂.,o%˭;[}G`H-v"Yr>DY$˧{|aK?K{,J ~cw_b\sn4kf_6xrw aOX?z~{wKlIp8= ̀hM{O?(Mt@/0~\V=О VHmv4&_69  ngP'Dsa(5ZG)j v|LLFO/DxDw%Po'q^m}Nbot 7+ jþ ;zstͣckg$r`$iዯDӆLY:?|󙜺"~컐 1a0z}f*W[ =2ʏ,h^/'+{!n1]UTqd]<^M 8٭8Oڣw[C@`@˻4|yץgc/o}4(dӹ`vF.I'UN:y/сG>ڙ׼EP};b]XTvim~ eyJ/OI)#Tcԍ X^$}ǎG=z$ #~GI+SG”K9R kǔm׉&P@j52Vh(5[+J@KnmT0e[oMcZQRQj5ִVpˍi,*V|xY'RQ Qol& 7R묮FN/6J|du+d7czzqttzW dgo?ku|%p;??aQDP@q? ڣ%T_Rd!(x[QeX$>N|~O,.{Gѻ-*t:x=z<#\ vJ_D׻QO-VggH-/ 7h9a|Wgs}/wߝ8QTL٨(XD]V'Ո6UFRfxjGũ:ƤP^(DI]1y2<I!"${Jؿztbe wq&?pY"bJI"a<̈́ y0q~3zDYFc4q td9%"$0)bPG4 ,="J(.)Hqe1δ"%!7 Q24 E$@NJ$"[uo334vQ$tCwa Nrlt//f2/3l- (Ø"f{1͚x7W \ܧ1>Cϛ}U/L%+!]#p׽Q<WKrُ2Q@Rn'&j\:-ױ )5rG{Jm qѨ&ǶzcʡOb}XMy>ҥ^Ջ7N2YTwCRmJ EKZKRU%ӷJe,Qϵ|17wP#E0u /S =E{VM^e0\*K[4ytjPTDńd]{uJ]/<<02_a/6s Plf\L<oKOы~yNF]jL=qqp䨛yVʵB~[yw@RSN,Um07xAhqG.|*p{;>Rkz)bί>Tyc e\uZ7[U~qwq*2*G̰gqSFY[dTUUQHpR\onPzqX ( WT^ٹ{0i׌JPJjjހW{^ C2w ONlF %(vpy%')_q/~o]^A%U۔Z|b,*.w{noݡ8vBh:̄JJfEq;,\$! 
$Tڼ`g9*hiwet*hLe4hb:eQĺ}9N9V>"ӺU!߹ҩ\$nW=mr%9۴n%1R(dZ[unݺDk*А\E7t ӞrN~ΕN/-D [NRR*2gv2=kLf'+[홀)S`~LJ ;ӠaZAr+ը pȹݢ¨=kJ/繨(k$\ £V챢Y-\z{jO*~0BSbr'O'8E˕C4=}F7|=5緇W|˯ .=;u[rûMa6Nj/Uު~󛩖)n(\1$N ;h˞nbT^dMNEQG6qFkgSnX $ ,u{9NuXjShʼnҭ_w;( TT^CTp8/ >/`ȉQK+RKޢ, B+yYBj#'NVF9r8֜Ix)5ߋm1EIY |?}G%YUǩn.Sy3EIh9;ՙoۺiI<~ ˽X̶iT|3ˆy',!߹)sj6=-&X7gGuʾuks!n=[NO20C*O/|(O ; D~ݲ0Y^6hf'v]?#C? CNv?(K֞|($Cr-$nzHpZUsCrֳ5( J !()YlsIh@Bg]9:>Y5AHac3[x"Q 宋pR c +Xa-k5c:k 7 P)]{ZA~t?ޥ}&gU _Pi#g#`+BLDá1n$!ihsOQ==ڛc/`cػfZ$?B*6Qt<>2N| DK$Kp߈4=8NRP&%:)&%lE8E ll2:9ƣr 7=?8MQ*$}#lV_VYjU&1l *7ɩlݾU9TmF?U_ EսjǛ,'}Ϻπ鋠0}4Ka Ì3i yN(DL۲HQKr3V݋-4Du -Eh预=1m4j$B1t!Y;(*8%7,/)*/ 4Vi>ǹÔt-U z*ܰ=݃wjG2fSS^Lb )0aAt/EiMsL2V RPsU"`*1ϋY6rv0 +:}ioI9`&cԁnWG)V}}|zfZwY<)b=ɳ2E|>w{|u~{IMlkU;K;.^mE,JY"`q}f>W=6/YT̾t8]ZPY?Z[Ef#+)I l?[.cz &'v5 `#=k \D`,:G.0dr:박jU޽5dH(OO XP\(w5o]p=? U7wPɋoK-N۾VQb2z!b ̥`"pRF( ˲\265\PFwYł榺sAIQP^Le ,CxY 'bA89(u'(8W5~//+dxq PcaOVyW9-1JKXbp(,* 16/X294`-te9Hoj=jysW߿tvO_}IUK3O_tCf'G??' &cZK*ՊR7~ץR @ ᆑ(nB"ϥ[ns;[_﹕BzлU\@SI՛?j.n"A5`XEXi7DO134PtBkZՉ5[,8kr.1J8E0TjJm Q'刪{@O}~3fG_EӺX8!*P-KϮ*Vݷ+֯|O&)CAow_/eO_?;Ө??v΅gdP_?\;4_y:uwT2˷>ĂLWS=_|~s}+S9ROϳ{aFB^GҕrH_'0!KAu):iJ~)yB NX&¹,H*,7H() qs\%YCT)+aX v:CU:8!P&yr&LR"M q%=ѽsNyQJ :Rm Jvv)< } 2v3 [_VB*-+8VC:$LKWaR_XKJO!>=\ppŬ-w~oUoثw>_W)+{p>,zJdG;'y4+݆y 1˜X`5F+07b z-w1~OSBKnM,HT\Q)Yr*AF;vQ$tޟrlz'\kynyہm6ULԶ>_c5 S]} ,8n>k@kW`jƖ Tq.3 އ aZKй׀8to@n0F''t5`pzmeL "9?;ˁ kd{͐۝^cDknWӃs+ D>=~z4˼]9K/+^΍<Dc}&A dĶƺ/ւ[|kuK1B!3Ĺg0!t -V!C'6팫)SY%-Ƈh1EZ;Iq?>Lo=G-t3oVM}U;³hV^rЃs)תd\rS砷Ă.٩4BR!@H@j|If;|jvҟy2j2(-*8P^-%N@3n( Xq̖L1_}a3-o΋M[e^uf;+q07zplr:cbH&9so*Cә{KLyamidhof)\J(-]XV.-Q,E2X&L)I8ػb,W ɉ%9WN.&cy +"Juݧ [A r)K{.i7P4Va6/j6[9+EU<#N&e>r<.P/ڔb"7{QD-/[8K-Y\߉>Q`|43e c' D`tE=G@0D&Vn{'Ĺ@eƶ(h)6r XHrI܊Y:q\0J 6P]/KýU8d0q*xrxRÐ84pT?ŻwQ*WQam2FZ&Lj~1!oLSԊ(,Z=HR1OQDSP}#y7¶m7 哉1s# \>=8 |O49685!9)qf񌑃F)sK{|F9}-W}Ϋ-nN|t1¡Z>XCx0A e_m!t^ `B #$ $+LK3DrkNu/"Ed0 HIEOb6 n[ZۭTLW]u#~e/D$[I 8!-tElA[o_}U;4Rmz 5|1@CcV9'| ^;V':˝9ƌ xRZ?~?>kJ^:fֹÓO뮋O&ƫs+>й@o>\mVA7VSlF)/ jfI=܍D'=mej䣛nSg z*sAK׈!4=8N`<&Ԉǰ##ʱDBğmw@Nn 
egd.{^szS2yʠ}*'E/A;AZ0JQ0D\BKkpl qHI8X,KU(נ񇼪d[=}U?|f8HjClzwZ`b7 ?ƜaʹF,Wnt`!#THO`NY6XC_abӒ!ѣHQX}SJ,f|O\dݘ\YޙO~5' n[erwOezűQQs8{ !EysZFs4:tsJt1Ic&H{EFZd$| ""}T<*(1]2q,2naÜ=|&؜V 8R L'xaC!!?rD]c(P܅#!)h8@WQI -wdTcY 4CÙtZ-a zauhon4q6bՒ SY@,It=v)U*x7F&5R1+?RH ysbZ Ucا$OX}w,$LԷ_ہ| 1ӳ,c<.W/L [j2)|MCS=zRZ0{Df ߻1δXYrMhkwhj!c^Q%f[/q.Vصۨ68= ?WB-WU:_hGW[wNhU5bA WyvSpXZtU)P%ѩ*8>8腭 be!_O4mX!CpEϭSF-~)bJŅG%n}yŌ}\dy$T%k;=jE$7fT rbEc6hBԀ_, /A_z?oR/-n3»;_1oos.7>cW**t2{ 7C)Uϗ4bή¯֠;s6]_i{rV [g*2;kCrS }+\tA }Fv9e&д*mV,R5!_9)p5c& wnu1aϨ".x`Lԙt)ݚEy8bxdžO%}9̛=a1^Ѧ \UtpYLfdy$Xͳ'D=*ՁW`Jb! lZl^5(91 tE+g 4!P{-Vèq%g+֤5\Z4(J%#-if#[v|4}\}HmCuC%r0|q;+l>)_ĤaRJi:'sto8܃޾S,t Z=}*Q{M-4,#'NBiv*kXFZ|Ĕ0#%8!4T%>'{ _g (34EZ+U,&Yg ?qk t5hXf9E6`945#)Y4Io2XdW%IGTF8$&҄(X3Q) Ca6:Ҋb}DD'|?Eg. C #"Fcʁ%OlW-I1v MD! JΧN3[Bs ΃ 4QtX%) $Ijb*li'u$q$lB 8%)82M *h#D܁wZԗ/e8 [o1in)OWRBa4H*8cHc\KH0qpJ2rV&(vUB`ۑ"SH{GKIil5Ħijm 뙂%00 Ha5!^1oRޤ4hA(q sH P_.ď/JpNRYuz uHJabLy!@"U)#^1Pݘ b(T Kl`YqP?t˶1&wwm`ӽÕՎ%s ᕃ {W?]_/ I2?/|hM>~VC)! 篤,)8ic[sX^0/> OBQ ZZ8C>q{:[C^nEv8EgUU; 71@<אk";#OBiFͮnyG? я,qo>FxQNDuwނ((wW#!(T۝ \h7B-I-2FHȂLDfN; T@lJ 41`=JSꀦ|,RV8*w)ĉ N;!RkYjWJTuFu0t0ōHftÌ R-ZhNϭXXBͰZ,S*8c -qsQ-1=-uA- #߉uU$i]ScRtoi+*kj7G!uf] b}FԙԾTt:22uf<%ђաGws4:%wbן̽_}3[YyYZ*Ԭ*aOwnْK7xr9|U$ɢn,Sn.jJΣU?fvYQp6.:+EkQ?ޡ< 'HhoꭵH%w m0ŗ"U(g'#J6 5N 'wi9kTT >>79!ŋPҙ/`3[0ytx13$ Gq'+c{#_c\;_.`ŒN&?_a0e[-D :! AvĜV#C!!g=2'=1OUpO:sOk wG؋կě٣HRIb0yKL3J)|94!!<`u&o[kQ%_REJ(J 8衫}[_ TP iW1_L(Z2L]Hۉ(jf! 
D ڊ|頰mUBrܣhHO>b+dwўS\ :A<ļ& *.& *ql b>b)*%ΧJ[w`Jr/j51%ډ@B.ew;*r5V1yF9bEh$ t- ݭ¢(TDjXFօZ A3 +-(lj&X%B!dIHMbnj7xUM ɕBjӺ֨ 97T:WF!.1 uQ%炋hZHvN!2 Y|UTLjFj%i!bĉl<NrҤxF/ G)@ů2ޜEg<wWW$k|¿3: >OQB2QĥxXoa7l5pYLfdy$G'gך>N]sqOM>&"9h68:?H*d+0 8At ~ 57 +4UaljQ+uV埌GI)HOA| SQlzaҐU0 :I=J\qKЈbż F89Ӄ|PB] nMRvO_/;Wxfnjug`-BX 91w<\&O)K)O$NsE=H.d)!0O3{8{KxBe0Ǚ*.9m/iZLԺ`Pa^/3X֖ 'kƪ=A|=kMw՗ͻ30W{oB\zl8a!J(vp"4.pCWgCո,.{SQs)Fœ'xI]#:8o9O CEVTUb#]޵6r#"e7شMi "A=r$^[$g~,efOgjbU8ƅV wĮM F 4+BޜL 4~{ޱqjQuj=!5X6Ju8Q!C =8 RMGuW~Q@齝G N0lX5O^,2O: u::q9J PSpYw\EwkGTVOwDEqŌk~x7 ߣ+4X@aAA,( '@0$+:3$ ܴk9B6t:!g7D8&JvN!?$|Oҥ"t)D)&*9" U>:rp% Re,>SȮ * x C7YTvg[Mz`[ ٗ ecG4#UqGEqLH0 jQja=ҔS U_drêe|8Ւ(IQd:ɅNցFĒ83CA("Dَ"D=pa3rjyUhZu|qA"G4y-R!py)J,eQV3 %pL+-XW^,F?=WWsG$)ྤ$2r +ídR) 0!Y&s(k"dͨo<{^ʹ7rP֒W_txkԺpI{kv؈o Jq\T=`#SITBfvD;UH9|AkݺX3HfJ) 3j~+%h&"j*A5CZNGۮMLs2St"U1[cI!$fx d-mѪ(X6"`H:0:CAhF0ܣ̂EV+31<îSqJ6B8᭼rTic[2Éqb:+ { t: T3F`TH3w} /Q9RDi^.JrQ{˔^| 1\[hP}{1n fIx(Y yM岯ȸm4Nٚ-BTsf] D׬[u;_5*S$%f] fvgB8 etNqfFGᢖGR<,Di^MrV4G Z“7&G۴pAeoOIeo Jj 0-Wنժfh B” Ʃdܔ%-@<ܚKRǵSU7(ڙV[բzXaآtNSed[fjUJhzgTT}ksyP)h;.1eV[ۆ|"%S6D'MJLm@˃:&mtRk=Ѻڭ E4K*uiéʘ|cRn: [sγr \:#kdJPkyv5Fha򚡫_!jDbgTSԌx'ubhr3\P;a;ɯSx E4GhǧTN^\9{B3JN!U -m>K$WPD2u9co=3@%_pSyo=Qa|-~yǘYDXT8>!oMaU N@JQ(x4;xEC~=?҂p3ON_}c+MX早mB\tt)etB %L IڛLG}6 Rx:\ j %1h֏fk%cdqWwwp]p??rŒ J N~QÕs`{1h2βSxGjzg\Ш^&{ϸ?bat44Z/^EF0KW`<g5eLfK,:9݀hiXY}E@AE_\i &k\>:[kې/\D)͙,7wg5T/DW>&QSJ{d~[]L)Ǯ 7F4V<N-u"4Vj w%y区V'QIiJ1s!JdJe׍źJg([|Gf8Izi͏ Qh?)RTCz4WH,,U4(eƇ 8xD3s9*pӒ=&ӹ4yr_nv29[.H>Zy6s%eXE4"Kn]Y XH2+j3=V*ZQEk"p_n a4`Þw@EYjG FjS i<徒z|޺kGԇm\#F V&OqJu?ZoI:&%1"U4ʩUDIG`*K p90*=A(͎ܟVNa."z\JQ3A&bJ\qYꨉ@ƒZ wH30}Eq$΄sBʖ"+t.!kA+⌲4'sy+UdahU`ė3K^!F _ED!' 
#L,^Iito9u,Q9g@RI%B:ɕB(%{T0YRU"Ws8&Ef;˰d\'˭W%+KuY"PCآ%JK dDJZR0X4KGKsyw!\X xžw))mk2gn=OaIʢϏ*.x< PT3BRyh75_~<=OiFȻ\_ous̏w|"pmI#Po>E3hqN9=膏DPR[X9>V^/w`EQ=2"8 rJ1 e+pԽVh_DRiR@iPD/DIОa3JAoK@k;Cdԯab d)dڻ* A*|Q*9V;U8MP>TI PBPEhRFdZg/M(W7'qY?8%~mG'ڛ;Y$myk58Y{{5VR_:Gc[\K -ץ(9:vBSǐ_ ޺m5q9W1^h: Ά |ig*hU0͝GBG#% S{ GOyG>ѵ85R\4Sa\.Cqc \`rn҄ZbƖTBV݄y,dPɟ0=rEwXn4纉tr-$aztrJ<@:9&~[II=X1Us!8<27W.|u4[}z.nEWIG/zdr'b=&XvJ3 5t\Mi卣tRyuRJSYM1n(~H&ϑ\z*2y!]{8P]+ T1FڠښǶT+F=)GNH?on-?M^O>C^n;k=$1ӣ@bu+&[szmX|~wrv:Ǹ 1 -fua~[',*|>Yg;_^EB(|*\W &C*|47 70V0p)B;c3?$EGRLފ%;Mӭ!'ʤEn /"Pt0.,)Bygb7UZhCnVE@ IkMK m/`E5P>b#;ɬ}AA%:A?<8.)v&&.5RpɖHÝ겈KTO~jv-r(\嶧z +3cR@\OV[-'e(̑ʛnZX*~dIZ3s,{Q C])tadZ V 3L;BnBẃlXX%1 |]od*foA^Wk40D4  PƔ3Pe>TK^7C WdQWqYQqe41+ᵄNT8O|̀RkRB ~1Nъo]!kc[MKY]/.E Wrfrkvن܄m[_H9W'LMMw'II$k"[*INOV M-rԕD|vp=$W۹3ҁ}d/_Vb#Zk*#W_M]2;(l,R5.S9eiz`e2۟.}HY"nc?Mj7ծf_DZ ,s_~";FPd^||ǞXPZKڂ}:@W^nw^HZ}©h戰PPeQFHb" $3F#L02(RWvP ly ʇ݈`pXzXg\V,=Ebc)CvzCR&Tmi?X3%d!RI~f) ?Ram,=$x[j2I,=EdrpdivOk]̙'R9 zR&ٙ'R?wY%e/XzHRYz,%$YrzR"XH9KOT*o3]*?&R+@= %P&z?"x[jB2ժ,=E"R Kci"5Af)&~,dd2x?yTG]Z C9zfY6[jM)P1 LZKSͼX< bjo!wE} sP_~_ AY F;k#= ix~5}^dNX~9<z :}V E~/3dZr|J؝Z>ğ6gy˛mkqD>}\5=F]goP;뛞c{h{9 Zo9}^Tжlw9IsOlN35NJ6bI;v{1>ZV8؀z38.0c5➇)~_ۖ#OT.2HIۖU4*Mk]iZu琚Sձ{7'S*[Xֶ0{\jE*zzu-zbzb#jZO[7_BrH լ`_t'9n~pS/"$k%}.@*S/s|XЍ^IҺR E Q*tT)40*aQpC6[څڅ>iX&͹Cl^ֵ'Hńf2|S 7УM-teJJq+ڼeMݑ\Y\.Bku7zuc RJD*#խPpQ.>#q֕3N() Jm4c)F)Ӛ+J44(D4h?th%ZNc59(JׇR;$ds0iL;VF$ Q,!A@I3 (Ĉ8ԈːġF1҄HV[ ]Y)rH?XViT`) {]w:)x:1H^ "@b9=)#i(K)ci"5UYz,evtÖ2T;i:],=eK5vnƴ,=,x[jJę'RR9TL+l8x~ͮ \0+= zefKWrnJT?.ҟ#;% /RC+ˆ@O0#ohd㟯7sAiZ_2mV)ȔT^AL^ZugzA{h8 #‰}ZamL@FHa cpXPF kXlz4K@Evsh P1ʂ`9_B' ˫6Fc5мt^ywշG`C*nŽ21 ӭӟMtE k&{ixAEcLjiPH`p *n+VYL˗/#BLKtaMտ1FKcBa !"P@uHccD0Eڛ7Ig)j?xV8C4XhR {0;J"8@ (4vTH8 c,٦NrwO.r4,`i_C֭}, V_?< $bv{|_D~"@,'ZwvYnC`%j78J ӧon,JF. 
TT/W[OFS8"tyFv|*B,3[cݝ}؞9E"&6|yhDMF.w o^{?tg"F\n{xکbЍ!ltyq1HpcHZZKE-[B+J 1O=]|kaAkx8`>".љBI:zzʎ[kRȤ%[2ޗX뜅,%nsW=)hTYrZV8r^V_$kj1$[ll[~1jM<}v# ȩgFhkz̯ȼծu-n PY*?aT{bVWUApx꭯# b2$( w6~|e JYqWec^KE&܅=*ɜtc**PI.쪄#ޛQDL'tBe=*u/a&U_^jEYf,&u R34k"yr6&`QKam9^V+; wfUN3ɍW^98Pg]&y?RT:QȺu\[e?4_VCsVT&___W;MmH֭*>ԩ"֭] $&<[ ))2\3ʊ~GχB+|Z<=ZN v_^Q9a+*qԱQ/.I6D(KK-\heAR'p9Ibe̖+g dil .ڐ!RC) *#0"}ȇW!;C @Z9w}?yGo!x3h<>@i,jQP"ke$6k$רlUVL >;ŜA1M ,')| 3)}t|>v:s9xlw?}߿OcJ}KcKt2oѨJG7L (%_%cf[4P|s Ui%d OnQM'7GcY Exӷg~N߼L keYxI]Ѵ-?Qbn^}T$dj=S܅svi\ux/%qn|SToh%*mLu uϕ^7\q=fW7}c'ۛݑXhQH%t¢hoA룯 kl1mEFp+ 반PW 'bQAgJɼmأ2,83Hç&D 0ܞ&.qs+"qWƠC7g'NU,~Q;g)هw9>O2ʎN{҆ @yv'S!Tm]=ɭRk(ʙbZ#-WS=]W4 镉gPQ!+Z&PE䔑E &kk$ :PCp=/pXMBh;^.Hh $[A* & Iȁx]{!"겎Z5@#j}{uUdbR321x] { R(r,$N_ٜQ=KxOůE|~"|A _4AP=\V:0mw$)+GZLXexT}PoQ@`T-5+ ݼGZu~yPntb%\.5Rt  <2%7P YV"4{&(k Ěw2QVي)An$ie/ovn*ߺP+&2:m9@H SԚ ʹ*K*ZaiȔ#dH7!J? !.c.4^#]71јɂ=wxwZVsxap!ބ4)%)~Bb?2$<6b1Q7Ő%kTWs>PHU@額Pa\(BA ȩp,͓3ʚW(O/E"|Ah'*Pk)Mi]%1!I\IU7 $0,/Ա!=JSy<<v"cih3'gCC/=лOC$ PagHuHA,A%%@0^FTy"P"Ce! Rms1'$2BM۽b將w7. J::QyzwQD2JbvpbQ9w#L Z <˹;!t[TPfJLrL)ݹ~ic^|-= }@LV լ݂(O5FFJBr ҿcn/첽$ f[\G ^o4B1:;} pG+vD'򳐨Waa~&LP;AY)EumJOH`808|Y)k+YgPSݬM4HsMm;G)3HCw:k&ʤouou$>yr  hIXJ#{]&1ѽƓcB8ZI*So۠Q@Wf`Tq#/E0 rnb[B#Xpr7ƽb nLoik,MĊF@i`Ef`hV Z~IFJϮOg#/דLl#8uZEե>YhyrAu !w9É)6(PޠqmlX8o/[0&=cmXd1Av֣:W/Ξ|K,D!';/fNw(>$:O\h-&|mtTkKs+&ǐjY']'aGF5\HPƕu&w11\ԭ3=zIZ&:k wdg1&߳h 1 x#ufY)սaFl~^;٤t \4%=t̜WOtz Julr\u9*+B΅?LG{4eޣysP N]&3"X(AQQY6wEܝ2Xvn׹%x( Nc;^&@26;_+P Ngdl)z1({Wf̨3`E"bLrp Iyd!0V*^}]U$) DʋeB++!ńD3!) 
chI<%%2xx'-+ErJ =!8KDZJEψwZ}w|'%w _K860 ?T&zx&A50تqxuIZOˎgsnO8s{z2Q~rLf\ri~Fz(|0:3,?՜pfߙU>tZ9nɬL;#ŃsL:tTR,Uj̪iYfͳS$*ЙUY$3$uA(3\f;9js2rmC2kkps2f#/!9i`?S4t.q^ּc@ږ2¦$楡C3_vM˨H=0e:vH]RKjdNvNa:&+je-WrjF2amZ( p yQ(M2V嚿c=Xˮq L=; 試Y%)į7y] hK'|4Yeu*+C'o=6Ͼ]#RIY4+i,cx/+x騣ULPJh,ӡ@Xy@}ϞΟwa^56eN/zMźIJ.:+aRHK'ΠCPKY"(AAYYr/~e?;m#ڮp߾Ũ gHC'%Z-ΓcɲdDa0c^!%9IW_Ҟ3amwYY"ĝI`QOwϡec 8-g4HJGGUϴזK-z`ΘRHsR/TrȘxEqh\kS|EdPjm UHAu`A unoba2Ųod/'ҹ0 `J RǴ9Zpun+^wƜJ%ls?T7)JX}!o7ĜxG1]=gGmtK/2K_HzxK/y(q镟P㞲*R IN!}Lwȍf_ҙDrXx4;wnTO5nqޤq_M>qܳU.o'_==NF_2 p==l'cxDdRo%z=vQI ><} jN8g'f8"6W o()8@=c)oBs&NT+ިJ<.39t %}t rԩ⥦Rsւ?'2S,tJVj5^)TN3Ne)uC,N g7|br:f=5H;nk2†9sBNvd>̀|+\)ƨ&}uX?R._0JW^5V6|^oX#i}3'J#I#EH2/('#Cҿgc:O'P:%e)xQURa5Uō.iךłsm$~,BXJC`DV[D8X![`MI IݗFKn!_u.E4tZs[I)ubE+^iA#]dpRXkm1+G 6%c!mlkLOk;$f|VxLDJ$;nmXx}zPZ.Į[cu"ֽ+ KZ9eiƥ]xQBq'=c " (׺:`1AH35t.'5B g\Q Kt1em.>b &oʻ F1]p߁HF$Tqm@5,6Kad{"5W'L]&S 6 njUyu8&p0pګf2:)BSB7CQ)&(fPo1]nkD22bR5l>e`}θpy,ݞDP GPvoM5{YQT' 2C{. )uTk*:wI5? Bvz{=ZG. Ъ?T5>^^n￯լU9 X>Z5y6m2pRxfʽfr%Eo7V7zMGO)lK]uT{P:\Dd+Ouv$GʃI#FҒ!1vKO> n]HW.I2W8Ps[)/z-!:Fvۧ!5Eiڭ ELqi׵¢O՚ yCc9&κS+VұL^S+n[IzJZPuXd"^.Ix\0rdLēU!͍!BJf 5&{U1"u>}t~n$8A %PDW.۬՚+@6M f/jh,k7BX\cd=lv;)G3P}|:t?WR^nhٯNYRr'?vįl)w`+ξnd'O=yVX"|<ۃmמK,,>JX7E@дc|G;b9t\EDes} nTU΄eJMŹ-Zf%C19DRPIev8TkO (6l<)sc)pon3YQ]]̳*cp)Pg$ J/6)e2юObnmAtO)IcxՀt9 LnLp.zlL$L@J߉WU8B2z.̀W h~~U1 nsw){̣_>f3pSv e]Ҩ7zV]zR׋g= p: ӝ@:4@l'DuM-5uYQ|}X)sQi#f~"qAvX [W䭰;_W^>?]0t$D)h2`~Rﬖs.ˊZFR1IY#RfTaJle絮Wx$dؽ gPQ!6-&U6C%˴A2c.sV(̞@e2Ө`*k6xӎ1^h{(^%YalK]BDq} _`AEm߿7ݮ,/ B!M=nNT29)xD(1XZfW*(l>/>cAf^O8sh$̬X(= TD8üPM,8mYU 滋&"U';B@m%O^v;( K #nT{յwjW.+SH)˔ADl0F*1'2|E0Rc!5do ∃_sb0|XU +n j=ZeQ>fy5,0~7|^MܱMܱ?V[^f*d^ k\ªPV"(U0%3RUe ?|^ ^bSz5Zc`٬wmsE_zeMPRV f %> a^DNIĝDNI?"Db^tP &9 yi1{JUu$+s,̩% s୅# k=pbJɃ~MW0ȯ_ԛ"9qȉ EN\(r"'$.6`.\xÕF`[J9Hbs#!(pK(iZpg`x-K3z3'Όf/Z!^ji1Rօ[QHpM3Qe*XSJ[Jt(IF%VCQ^%ќPН s0庢e.][+lS@2W^%x/R[:{uPh!I{!<ӑ"$'XPc+\` ]F=1#5>l '"^S $2%;\V:/CNCxi$+ʪlƠ* uV&2gNGk/}q X(kF1>^ ۈtvEyN*\R[xJ"K!05C1Sk=U)N7S]wyݔCӋGS^U3-zN3X oߟy[VW?]^Fdw\a ̏?9O<L?S͟3z_m];k~zpx 
12524ms (21:28:28.303) Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[622355876]: [12.524563965s] [12.524563965s] END Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.303685 4795 
reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.304742 4795 trace.go:236] Trace[1224879230]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:13.437) (total time: 14867ms): Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1224879230]: ---"Objects listed" error: 14867ms (21:28:28.304) Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1224879230]: [14.867472332s] [14.867472332s] END Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.304761 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309653 4795 trace.go:236] Trace[1378771009]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:17.137) (total time: 11172ms): Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1378771009]: ---"Objects listed" error: 11172ms (21:28:28.309) Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1378771009]: [11.172581909s] [11.172581909s] END Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309686 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309705 4795 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309962 4795 trace.go:236] Trace[1492480695]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:13.684) (total time: 14625ms): Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1492480695]: ---"Objects listed" error: 14625ms (21:28:28.309) Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1492480695]: [14.625326642s] [14.625326642s] END Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309989 4795 reflector.go:368] Caches populated for *v1.Service from 
k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.311112 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.316403 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.435747 4795 apiserver.go:52] "Watching apiserver" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.444608 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.444782 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445076 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445273 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445078 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.445331 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445357 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445403 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.445504 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.445539 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.447940 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.447987 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.448079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.448231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.448593 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.450302 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.450486 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.450670 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.450794 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.451802 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2025-12-07 03:36:06.082769997 +0000 UTC Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.473021 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.485310 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.498038 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.509966 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.530197 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.542444 4795 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.543426 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.546701 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58612->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.546750 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58612->192.168.126.11:17697: read: connection reset by peer" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.556146 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.566492 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.595784 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.597457 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650" exitCode=255 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.597489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650"} Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.607146 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.608570 4795 scope.go:117] "RemoveContainer" containerID="d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.609978 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610397 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610429 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610450 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610465 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610481 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610517 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610567 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610620 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610660 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610690 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610775 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610815 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610829 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610877 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610906 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611035 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611238 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611263 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611272 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611286 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611324 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc 
kubenswrapper[4795]: I0219 21:28:28.611358 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611381 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: 
"kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611499 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611512 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611582 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611608 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611619 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611665 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611784 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611818 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611846 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611898 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611920 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611968 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612004 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612072 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612094 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612117 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612286 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612317 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612332 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612351 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612368 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612384 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612399 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612453 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612485 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612504 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612520 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612536 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612583 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612598 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612614 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612629 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612679 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612696 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612780 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612810 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612886 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612918 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612938 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612954 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612984 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613018 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613033 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613051 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613068 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613083 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613115 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613130 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613145 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611791 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611819 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611849 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612309 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612375 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612802 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612811 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613286 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613300 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613018 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613195 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613015 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613424 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613442 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613472 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613582 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613600 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613628 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613711 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613926 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613943 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614065 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614126 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614245 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614314 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614339 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 21:28:28 crc 
kubenswrapper[4795]: I0219 21:28:28.614360 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614380 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614403 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614447 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614475 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614516 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614563 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614586 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614633 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.614657 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.114640875 +0000 UTC m=+20.307158739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614677 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614698 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614720 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614772 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614794 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614780 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614817 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614867 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614930 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614992 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615073 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615129 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615312 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615339 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615400 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615421 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615466 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615514 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615531 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615547 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615590 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615628 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615749 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615773 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615810 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615832 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616181 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616213 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616252 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616652 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617265 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617507 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617573 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617591 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617641 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617670 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617709 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617875 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617893 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617918 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617974 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618011 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618067 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618078 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618087 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618098 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618107 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618118 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618128 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618142 4795 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618153 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618212 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618222 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618233 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619050 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619101 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619202 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619230 4795 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619268 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619285 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619299 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619319 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619354 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619368 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619435 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619454 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619469 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619688 4795 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619711 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619724 4795 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619737 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619782 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619800 4795 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619880 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619899 4795 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619916 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621504 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621524 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621542 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621563 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621577 4795 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621591 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621606 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621625 4795 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621639 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621652 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621665 4795 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621682 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621696 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621713 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621734 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" 
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621746 4795 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621761 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621773 4795 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621790 4795 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621804 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614864 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615256 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615590 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615744 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615867 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616130 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616490 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616831 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616839 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617037 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617576 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617648 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617696 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618220 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619029 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.620494 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.620861 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.620920 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621865 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621968 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.622234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.622929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623054 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623102 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623209 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623403 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623420 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623604 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.624066 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.624971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.625274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.629069 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.629333 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630083 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630550 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630772 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630939 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.631142 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.632459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.632818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633054 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633562 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.631707 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634378 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634428 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634643 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634755 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634774 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635128 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630777 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.632205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635288 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636315 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636355 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623075 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636841 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636853 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637155 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637188 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636109 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638007 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637842 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638151 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638136 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638538 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638581 4795 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.639216 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639677 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.639736 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.139719316 +0000 UTC m=+20.332237180 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.639970 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.640191 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.140150029 +0000 UTC m=+20.332667903 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.640205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.640410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.640612 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.645255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.645405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.650663 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.654107 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.654129 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.654142 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.654217 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.154200549 +0000 UTC m=+20.346718413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.658007 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.658037 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.658049 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.658099 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.158082692 +0000 UTC m=+20.350600556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.658778 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.665684 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.666379 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.666503 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.680265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.680265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.681748 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.681779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.682921 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.683123 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.683841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.684251 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.684982 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.685220 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686323 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686350 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686628 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.687996 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.693200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.693485 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.698690 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.701426 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.701904 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.710177 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.711741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722622 4795 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722646 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722658 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722672 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722684 4795 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722696 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722708 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722702 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722719 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722777 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722793 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722807 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722820 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722840 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722853 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722867 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722879 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 19 
21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722893 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722906 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722918 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722931 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722943 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722955 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722967 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 
21:28:28.722979 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722991 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723003 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723015 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723028 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723040 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723051 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723062 4795 reconciler_common.go:293] "Volume detached for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723074 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723085 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723099 4795 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723112 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723123 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723136 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723148 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723184 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723196 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723207 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723219 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723230 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723243 4795 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723254 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723266 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723277 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723289 4795 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723300 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723312 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723323 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723334 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723347 4795 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723359 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723370 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723382 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723395 4795 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723407 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723418 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723432 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723444 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723461 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723473 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723484 4795 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723527 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723540 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723552 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723564 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723576 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723587 4795 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723599 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723611 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723624 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723637 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc 
kubenswrapper[4795]: I0219 21:28:28.723648 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723660 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723671 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723682 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723693 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723705 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723716 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723730 4795 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723743 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723755 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723768 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723782 4795 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723794 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723806 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723817 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723829 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723840 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723851 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723863 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723875 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723886 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723898 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on 
node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723909 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723920 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723932 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723944 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723957 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723971 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723982 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: 
I0219 21:28:28.723993 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724005 4795 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724018 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724030 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724042 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724055 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724068 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724079 4795 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724090 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724102 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724113 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724125 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724136 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724150 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724220 4795 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724232 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724244 4795 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724256 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724268 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724279 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724291 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724424 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724438 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724449 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724461 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724472 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724484 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724496 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724508 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724519 4795 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724531 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724542 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724556 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724567 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724578 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724589 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724601 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.761295 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.767993 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.773907 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: W0219 21:28:28.777884 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-1a2194b11ba3a035a9863493f71d4d63053a4ff1210ab5a016ee0a984f8f0136 WatchSource:0}: Error finding container 1a2194b11ba3a035a9863493f71d4d63053a4ff1210ab5a016ee0a984f8f0136: Status 404 returned error can't find the container with id 1a2194b11ba3a035a9863493f71d4d63053a4ff1210ab5a016ee0a984f8f0136 Feb 19 21:28:28 crc kubenswrapper[4795]: W0219 21:28:28.778641 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-4db24bb73e8db7b7311192e90d95536edc7b4e668b036dc5ed8311816045ea99 WatchSource:0}: Error finding container 4db24bb73e8db7b7311192e90d95536edc7b4e668b036dc5ed8311816045ea99: Status 404 returned error can't find the container with id 4db24bb73e8db7b7311192e90d95536edc7b4e668b036dc5ed8311816045ea99 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.926930 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.926986 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.127969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.128174 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.128132996 +0000 UTC m=+21.320650860 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.229259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.229313 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.229340 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.229365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229506 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229503 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229550 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229555 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229581 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229617 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229632 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229564 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.229550983 +0000 UTC m=+21.422068847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229704 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.229680607 +0000 UTC m=+21.422198531 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229719 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.229711398 +0000 UTC m=+21.422229272 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229564 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229865 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.229831221 +0000 UTC m=+21.422349165 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.452435 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:34:53.534242081 +0000 UTC Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.514579 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.515378 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.516179 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.516760 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.517376 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.517878 4795 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.518742 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.519411 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.520097 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.522453 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.523187 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.523862 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.524494 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.525002 4795 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.525526 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.526011 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.526587 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.526994 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.527557 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.528082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.528666 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.529219 4795 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.529639 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.530261 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.530678 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.531362 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.533495 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.533587 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.534457 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.535380 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.536048 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.536716 4795 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.536855 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.540329 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.541105 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.541809 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 19 21:28:29 
crc kubenswrapper[4795]: I0219 21:28:29.545082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.546010 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.547276 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.548209 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.548738 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.549646 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.550332 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.551141 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.552732 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.554087 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.554788 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.556028 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.556877 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.558416 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.559115 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.561537 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.562758 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.565022 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.565402 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.566338 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.567634 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.582652 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.598252 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.600812 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.600855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1a2194b11ba3a035a9863493f71d4d63053a4ff1210ab5a016ee0a984f8f0136"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.603083 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.604628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.605196 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.606319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.606353 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697"} Feb 19 21:28:29 crc 
kubenswrapper[4795]: I0219 21:28:29.606368 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c85a50be9fb5e82c64ae1cb4bd5344baef3eff3b4599c6a5e0402ed9421e65c6"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.607280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4db24bb73e8db7b7311192e90d95536edc7b4e668b036dc5ed8311816045ea99"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.610534 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.621634 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.638696 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.667683 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.682421 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.694785 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.705483 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.715604 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.725082 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.141356 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.141564 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.141534652 +0000 UTC m=+23.334052516 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.242618 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.242664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.242698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.242718 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242784 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242863 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.242845686 +0000 UTC m=+23.435363550 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242918 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242795 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243042 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.243011901 +0000 UTC m=+23.435529795 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242925 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243103 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243130 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242949 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243208 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243228 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.243207397 +0000 UTC m=+23.435725301 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243297 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.243244648 +0000 UTC m=+23.435762512 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.452780 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 11:51:18.934310671 +0000 UTC
Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.511153 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.511197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.511200 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.511302 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.511460 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.511674 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.453713 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 20:12:52.653068241 +0000 UTC
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.603212 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.614443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9"}
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.619316 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"]
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.619990 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.620436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.635911 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.651533 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.663898 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.676278 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.690281 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.702559 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.716388 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z"
Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.727554 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.739817 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.759292 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.769968 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.782286 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.795567 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.808587 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.158110 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.158305 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.15828061 +0000 UTC m=+27.350798484 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.259238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.259296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.259322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.259349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259469 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259437 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259490 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259529 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.259516242 +0000 UTC m=+27.452034106 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259538 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259564 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259576 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.259546573 +0000 UTC m=+27.452064467 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259494 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259606 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259619 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259654 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.259616705 +0000 UTC m=+27.452134609 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259681 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.259670497 +0000 UTC m=+27.452188451 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.454575 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 18:06:54.616184067 +0000 UTC Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.511429 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.511491 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.511572 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.512151 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.512327 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.512532 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.980527 4795 csr.go:261] certificate signing request csr-62htn is approved, waiting to be issued Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.999227 4795 csr.go:257] certificate signing request csr-62htn is issued Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.377238 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-blzsk"] Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.377427 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-fxj5d"] Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.377572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.377646 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.379984 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380239 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380346 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380458 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380512 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380975 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380578 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380989 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.392787 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.424344 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.438539 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.451878 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.454700 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 06:24:44.746485258 +0000 UTC Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.466384 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.484016 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.495531 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.512795 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.524721 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.539445 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.561811 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750
495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7591bc58-96f5-486a-8653-0ad93938b019-mcd-auth-proxy-config\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570551 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9643227-37ca-4e4a-b9bc-371b18d67edc-hosts-file\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7591bc58-96f5-486a-8653-0ad93938b019-proxy-tls\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570754 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmtf8\" (UniqueName: \"kubernetes.io/projected/e9643227-37ca-4e4a-b9bc-371b18d67edc-kube-api-access-lmtf8\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570840 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7591bc58-96f5-486a-8653-0ad93938b019-rootfs\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwrxf\" (UniqueName: \"kubernetes.io/projected/7591bc58-96f5-486a-8653-0ad93938b019-kube-api-access-wwrxf\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.576388 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.587832 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.608797 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.641035 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672139 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9643227-37ca-4e4a-b9bc-371b18d67edc-hosts-file\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672449 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7591bc58-96f5-486a-8653-0ad93938b019-proxy-tls\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672555 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmtf8\" (UniqueName: \"kubernetes.io/projected/e9643227-37ca-4e4a-b9bc-371b18d67edc-kube-api-access-lmtf8\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672655 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7591bc58-96f5-486a-8653-0ad93938b019-rootfs\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwrxf\" (UniqueName: \"kubernetes.io/projected/7591bc58-96f5-486a-8653-0ad93938b019-kube-api-access-wwrxf\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7591bc58-96f5-486a-8653-0ad93938b019-mcd-auth-proxy-config\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9643227-37ca-4e4a-b9bc-371b18d67edc-hosts-file\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7591bc58-96f5-486a-8653-0ad93938b019-rootfs\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.673597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7591bc58-96f5-486a-8653-0ad93938b019-mcd-auth-proxy-config\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.676492 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7591bc58-96f5-486a-8653-0ad93938b019-proxy-tls\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.700151 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.705010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmtf8\" (UniqueName: \"kubernetes.io/projected/e9643227-37ca-4e4a-b9bc-371b18d67edc-kube-api-access-lmtf8\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.710122 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.731094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwrxf\" (UniqueName: \"kubernetes.io/projected/7591bc58-96f5-486a-8653-0ad93938b019-kube-api-access-wwrxf\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.774351 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.783980 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-l29c7"] Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.784613 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5p6d9"] Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.784794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.784847 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.787319 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.787553 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.787785 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.787920 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.788134 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.790487 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.792787 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.792805 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.804636 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b9
4b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.817335 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.833214 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.845205 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.857530 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.873144 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.885658 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.901923 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.913414 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.926458 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.941883 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.970330 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750
495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-hostroot\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974794 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx6hj\" (UniqueName: \"kubernetes.io/projected/e967392b-9bd8-4111-b1b9-96d503a19668-kube-api-access-qx6hj\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974823 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-socket-dir-parent\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-kubelet\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974973 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-multus-certs\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: 
I0219 21:28:33.975069 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-os-release\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975095 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-system-cni-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975119 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-system-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-cni-binary-copy\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " 
pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975224 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-k8s-cni-cncf-io\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-conf-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-netns\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-multus-daemon-config\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " 
pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975377 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-etc-kubernetes\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cnibin\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-cnibin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-os-release\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc 
kubenswrapper[4795]: I0219 21:28:33.975531 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4pzd\" (UniqueName: \"kubernetes.io/projected/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-kube-api-access-f4pzd\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-multus\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-bin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.983432 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.000099 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-19 21:23:32 +0000 UTC, rotation deadline is 2026-12-20 18:37:00.677622872 +0000 UTC Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.000407 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7293h8m26.677221874s for next certificate rotation Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.000266 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 
21:28:34.005346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:34 crc kubenswrapper[4795]: W0219 21:28:34.014917 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7591bc58_96f5_486a_8653_0ad93938b019.slice/crio-d574fc53e044370889753eeb0c62a38953bf809ef1ccb2d1baf5e98adf4e9947 WatchSource:0}: Error finding container d574fc53e044370889753eeb0c62a38953bf809ef1ccb2d1baf5e98adf4e9947: Status 404 returned error can't find the container with id d574fc53e044370889753eeb0c62a38953bf809ef1ccb2d1baf5e98adf4e9947 Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.076914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-cnibin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.076978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-os-release\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-multus\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4pzd\" (UniqueName: 
\"kubernetes.io/projected/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-kube-api-access-f4pzd\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077102 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-bin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-hostroot\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx6hj\" (UniqueName: \"kubernetes.io/projected/e967392b-9bd8-4111-b1b9-96d503a19668-kube-api-access-qx6hj\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-socket-dir-parent\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-kubelet\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-multus-certs\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-os-release\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-system-cni-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-system-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-cni-binary-copy\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077451 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-k8s-cni-cncf-io\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-conf-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-netns\") pod \"multus-5p6d9\" 
(UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-multus-daemon-config\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077584 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-etc-kubernetes\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cnibin\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077662 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 
19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-hostroot\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078072 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-socket-dir-parent\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-multus\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-system-cni-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-conf-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-system-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-cnibin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-os-release\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078470 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-kubelet\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-bin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-multus-certs\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " 
pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078549 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-etc-kubernetes\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078600 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-netns\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-os-release\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cnibin\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078934 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-k8s-cni-cncf-io\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.079009 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.079097 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-cni-binary-copy\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.079307 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.079469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-multus-daemon-config\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.094104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx6hj\" (UniqueName: \"kubernetes.io/projected/e967392b-9bd8-4111-b1b9-96d503a19668-kube-api-access-qx6hj\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.094315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4pzd\" (UniqueName: \"kubernetes.io/projected/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-kube-api-access-f4pzd\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.095800 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.105210 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: W0219 21:28:34.119303 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode967392b_9bd8_4111_b1b9_96d503a19668.slice/crio-4d4b0b683458da919192e168b950b3bebf6b1cfe448325e37e3777ebb0f2e9ff WatchSource:0}: Error finding container 4d4b0b683458da919192e168b950b3bebf6b1cfe448325e37e3777ebb0f2e9ff: Status 404 returned error can't find the container with id 4d4b0b683458da919192e168b950b3bebf6b1cfe448325e37e3777ebb0f2e9ff Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.144013 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4qphl"] Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.146118 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.147572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.148235 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.148575 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.148672 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.148734 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.149070 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.150011 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.165102 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.175502 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.186682 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.199358 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.210726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.220142 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.231276 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.251940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750
495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.266644 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.277936 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") pod \"ovnkube-node-4qphl\" (UID: 
\"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281514 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") 
pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281694 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281728 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281774 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281932 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.282001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.282032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.294463 4795 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.306063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 
21:28:34.318239 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382754 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: 
\"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382952 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382982 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383020 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383069 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383081 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383098 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383123 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc 
kubenswrapper[4795]: I0219 21:28:34.383154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383187 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383205 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383220 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383254 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383337 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383400 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383258 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383339 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.384106 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.384272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") pod \"ovnkube-node-4qphl\" (UID: 
\"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.384278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.384462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.387660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.397919 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.454990 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 00:33:46.495095331 +0000 UTC Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.511684 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.511795 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.511697 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.511874 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.512013 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.512132 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.537735 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: W0219 21:28:34.572591 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf5bd36_b46b_4a06_8291_cae9f3988330.slice/crio-740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193 WatchSource:0}: Error finding container 740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193: Status 404 returned error can't find the container with id 740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193 Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.623022 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9" exitCode=0 Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.623105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.623193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerStarted","Data":"06cc03bf1c3b39c6d3842b732a42963fbbed69cfc78269fbb0494e80f0536205"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.624225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-blzsk" 
event={"ID":"e9643227-37ca-4e4a-b9bc-371b18d67edc","Type":"ContainerStarted","Data":"f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.624265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-blzsk" event={"ID":"e9643227-37ca-4e4a-b9bc-371b18d67edc","Type":"ContainerStarted","Data":"8fd79f423625c1967ec4fdeb8bb0c4df88f46a9fc1c2c400d9a475409b01da2f"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.625575 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.627527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.627562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.627576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"d574fc53e044370889753eeb0c62a38953bf809ef1ccb2d1baf5e98adf4e9947"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.628619 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" 
event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.628660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"4d4b0b683458da919192e168b950b3bebf6b1cfe448325e37e3777ebb0f2e9ff"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.647294 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.662497 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.673492 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.688546 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.699806 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.710375 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.711566 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.713540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.713577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.713586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.713719 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.720321 4795 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.720615 4795 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721652 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.727122 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.741518 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744617 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.746590 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.757046 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.757501 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760542 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.768241 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.772997 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776815 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.787600 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.789273 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792550 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.800286 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.803103 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.803530 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805080 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805111 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.811828 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.822758 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.833394 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.843824 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.854996 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.872538 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.883049 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.892874 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.905432 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906765 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.917457 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.929139 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.944178 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.956312 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.968822 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.008993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.009030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.009041 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.009056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.009068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111453 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213839 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.315849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.316126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.316136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.316150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.316158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419225 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.455558 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:14:55.983096506 +0000 UTC
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522095 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624623 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.633802 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e" exitCode=0
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.633862 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e"}
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.636328 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" exitCode=0
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.636494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"}
Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.670210 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.685386 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.694315 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.708489 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.721394 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.726593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc 
kubenswrapper[4795]: I0219 21:28:35.726633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.726650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.726666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.726682 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.736259 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.746195 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.757743 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.770755 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.790280 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.802690 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.813756 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.823607 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.832763 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.848646 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.860638 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.879502 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.891859 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.904205 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.913808 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.924733 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.929315 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931154 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.933462 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.935069 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.935817 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.953395 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.963915 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.973326 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.983485 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.984752 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jvnv5"] Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.985149 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.986233 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.986488 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.986692 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.986910 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.995554 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.997440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-host\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.997489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89q2s\" (UniqueName: \"kubernetes.io/projected/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-kube-api-access-89q2s\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.997518 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-serviceca\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.032199 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033429 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.067371 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.098861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-host\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.098931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89q2s\" (UniqueName: \"kubernetes.io/projected/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-kube-api-access-89q2s\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.098934 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-host\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " 
pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.098962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-serviceca\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.099956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-serviceca\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.107673 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f4
16f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89q2s\" (UniqueName: 
\"kubernetes.io/projected/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-kube-api-access-89q2s\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.171874 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.199736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.199947 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.199927394 +0000 UTC m=+35.392445258 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.207847 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-contr
oller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240436 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240460 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.250413 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.289754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"n
ame\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.297000 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.300672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.300715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.300757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.300784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300868 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300882 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300899 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300929 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.300913068 +0000 UTC m=+35.493430932 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300905 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300950 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.300941119 +0000 UTC m=+35.493458983 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300953 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300959 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.301008 4795 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.301023 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300987 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.30097168 +0000 UTC m=+35.493489534 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.301093 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.301081393 +0000 UTC m=+35.493599257 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:36 crc kubenswrapper[4795]: W0219 21:28:36.308543 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ec0ce2_79c0_42a4_88e2_71065ec8ff9f.slice/crio-d2a3fea10dcf077753d86894e22ca3d47666d60b886c420c35b0cd17318add40 WatchSource:0}: Error finding container d2a3fea10dcf077753d86894e22ca3d47666d60b886c420c35b0cd17318add40: Status 404 returned error can't find the container with id d2a3fea10dcf077753d86894e22ca3d47666d60b886c420c35b0cd17318add40 Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.329858 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.342719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc 
kubenswrapper[4795]: I0219 21:28:36.342771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.342785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.342804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.342816 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.376016 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.408086 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.445812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.445846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.445855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 
21:28:36.445869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.445877 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.448409 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.455818 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 07:59:09.9220675 +0000 UTC Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.488722 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.510950 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.511049 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.511107 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.511101 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.511157 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.511308 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.529186 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.547767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc 
kubenswrapper[4795]: I0219 21:28:36.547799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.547807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.547821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.547831 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.572789 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.608866 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.640990 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jvnv5" event={"ID":"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f","Type":"ContainerStarted","Data":"558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.641029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jvnv5" 
event={"ID":"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f","Type":"ContainerStarted","Data":"d2a3fea10dcf077753d86894e22ca3d47666d60b886c420c35b0cd17318add40"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.643958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.643991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.644005 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.644017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.644027 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.644036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" 
event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.646118 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.646309 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561" exitCode=0 Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.646339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649295 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649306 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.687733 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.731338 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.751129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc 
kubenswrapper[4795]: I0219 21:28:36.751191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.751206 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.751223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.751234 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.766724 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.811330 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.848341 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.852923 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.852946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.852954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc 
kubenswrapper[4795]: I0219 21:28:36.852969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.852980 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.890430 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.926906 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955473 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.972208 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.009621 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.050135 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058646 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058670 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.092602 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.133005 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161310 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.168800 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.210136 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.249660 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263968 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.290944 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.330365 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366197 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.369956 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.414728 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4
a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.450136 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.456323 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 05:54:06.083236843 +0000 UTC Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469157 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.489798 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.528490 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.571979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.572036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.572060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.572089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.572110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.575106 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.611942 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.652498 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf" exitCode=0 Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.652551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" 
event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.665879 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675274 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.688657 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.732185 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.769855 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777906 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.809618 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.855742 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880227 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.889464 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5
bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.928971 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.967517 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982712 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.011435 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.050108 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.084967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.084998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.085008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 
21:28:38.085024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.085037 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.127444 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.144513 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.173244 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187609 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.210410 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.262490 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4
a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.288399 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.289520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.289564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.289579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc 
kubenswrapper[4795]: I0219 21:28:38.289597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.289609 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.330952 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 
21:28:38.392041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.392073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.392084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.392098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.392107 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.456603 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 06:53:23.93784557 +0000 UTC Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494801 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.510953 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.510983 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.510953 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:38 crc kubenswrapper[4795]: E0219 21:28:38.511096 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:38 crc kubenswrapper[4795]: E0219 21:28:38.511153 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:38 crc kubenswrapper[4795]: E0219 21:28:38.511268 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597232 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.658431 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094" exitCode=0 Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.658499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.662900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.674872 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.691694 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699210 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.702532 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.728658 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.749659 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.764538 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.779189 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.789360 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801879 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801856 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.811990 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.821229 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.833380 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.848392 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4885
2eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.886467 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f
4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904157 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904222 4795 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904248 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.935923 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900922
72e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beee
fd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006442 4795 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006463 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108386 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108419 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210968 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.290399 4795 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314015 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314083 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421890 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421901 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.457569 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:40:52.051265018 +0000 UTC Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524926 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524961 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.528786 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.540100 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.553306 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.564727 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.590779 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.605727 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.622243 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627863 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.634912 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.646357 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.668018 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72" exitCode=0 Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.668155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.680836 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.698002 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.710318 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.725269 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730756 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.743365 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z 
is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.756143 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.768117 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.779807 4795 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.792509 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.804063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.815227 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.824462 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833106 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.839864 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.853019 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.886420 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935116 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.939871 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.965645 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.007827 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038612 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.046391 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.088117 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.133991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140779 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.242986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.243020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.243028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.243042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.243052 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346682 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346713 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449364 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.458655 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 23:37:41.334687827 +0000 UTC Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.510652 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.510686 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.510714 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:40 crc kubenswrapper[4795]: E0219 21:28:40.510892 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:40 crc kubenswrapper[4795]: E0219 21:28:40.511018 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:40 crc kubenswrapper[4795]: E0219 21:28:40.511274 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552809 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.655484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.656020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.656037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.656056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.656068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.677372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.682460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerStarted","Data":"47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.717617 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc 
kubenswrapper[4795]: I0219 21:28:40.739869 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758177 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758211 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.761809 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.774545 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.793455 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.805367 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.814150 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.826273 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.836658 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.846219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859954 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.864992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.876651 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.885410 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.903599 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.915215 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.919076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.931871 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.952473 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962748 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.965685 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.980035 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.994229 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 
21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.008099 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.024927 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.051293 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064819 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.092698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z 
is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.132473 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167475 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.177206 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.207805 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.249476 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270524 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.292448 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.328012 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.372501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.372533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.372541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc 
kubenswrapper[4795]: I0219 21:28:41.372556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.372567 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.458874 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 18:58:14.291355369 +0000 UTC Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476617 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476639 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579410 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682305 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.685375 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.685451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.702449 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.713999 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.715057 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.727716 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"e
nv-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.745532 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.767744 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.785815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.785921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.785952 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.785981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.786007 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.800314 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cd
fe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.819743 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.839348 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.856033 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.879961 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.889042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.889081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.889090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc 
kubenswrapper[4795]: I0219 21:28:41.889141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.889153 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.897903 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.913663 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.930381 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.942431 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.952302 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.972585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
2-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.986009 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.991772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc 
kubenswrapper[4795]: I0219 21:28:41.991815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.991827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.991844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.991857 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.005833 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.058369 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21
:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.089685 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094372 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.129723 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.171380 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47
780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.197148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.197233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.197253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.197274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: 
I0219 21:28:42.197289 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.211414 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.247939 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.295052 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300176 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.331898 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.369325 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402874 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402907 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.410904 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 
21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.447791 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.459113 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2026-01-09 06:55:26.423708245 +0000 UTC Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.497609 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.511460 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.511476 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.511594 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:42 crc kubenswrapper[4795]: E0219 21:28:42.511707 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:42 crc kubenswrapper[4795]: E0219 21:28:42.511857 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:42 crc kubenswrapper[4795]: E0219 21:28:42.511987 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.530940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607931 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.687422 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711121 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813472 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916150 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018691 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018730 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121130 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326488 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.428796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.429000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.429060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.429132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.429230 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.460187 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 13:50:48.413691279 +0000 UTC Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.531826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.532014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.532073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.532146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.532233 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634413 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.691749 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/0.log" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.694369 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359" exitCode=1 Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.694412 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.695123 4795 scope.go:117] "RemoveContainer" containerID="2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.716620 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.728104 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.737114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.737270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.737362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc 
kubenswrapper[4795]: I0219 21:28:43.737428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.737508 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.740959 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.755938 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.774142 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\" 6105 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 21:28:43.391192 6105 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 21:28:43.391243 6105 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0219 21:28:43.391249 6105 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 21:28:43.391260 6105 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 21:28:43.391273 6105 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 21:28:43.391288 6105 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 21:28:43.391303 6105 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0219 21:28:43.391308 6105 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 21:28:43.391308 6105 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 21:28:43.391320 6105 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 21:28:43.391330 6105 factory.go:656] Stopping watch factory\\\\nI0219 21:28:43.391334 6105 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 21:28:43.391323 6105 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 21:28:43.391350 6105 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 21:28:43.391356 6105 ovnkube.go:599] Stopped 
ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e23
54cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.787948 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.799376 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.812717 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.824398 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840385 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.851562 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.865002 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.876485 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.894618 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.909025 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.921117 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943237 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943280 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045148 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147685 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147720 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.281419 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.281614 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:29:00.281597885 +0000 UTC m=+51.474115749 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351312 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.382276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.382328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.382359 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.382381 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382417 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382441 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382467 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382476 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:00.382459995 +0000 UTC m=+51.574977859 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382485 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382491 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:00.382485476 +0000 UTC m=+51.575003340 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382495 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382494 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382543 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:00.382534908 +0000 UTC m=+51.575052772 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382546 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382565 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382628 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:00.38261461 +0000 UTC m=+51.575132474 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453266 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.460439 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 21:11:09.56090668 +0000 UTC Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.511051 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.511137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.511051 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.511214 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.511263 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.511341 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658094 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.697525 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/1.log" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.698058 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/0.log" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.700267 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" exitCode=1 Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.700299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.700338 4795 scope.go:117] "RemoveContainer" containerID="2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.700878 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.701045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.714263 4795 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.723901 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.736300 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\
\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.747397 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.756336 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760013 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760084 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.770062 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.780494 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.787736 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.805762 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.816264 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.826683 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.836886 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.847235 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855869 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.857331 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.866434 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869148 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.875857 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\" 6105 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 21:28:43.391192 6105 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 21:28:43.391243 6105 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 21:28:43.391249 6105 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0219 21:28:43.391260 6105 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 21:28:43.391273 6105 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 21:28:43.391288 6105 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 21:28:43.391303 6105 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0219 21:28:43.391308 6105 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 21:28:43.391308 6105 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 21:28:43.391320 6105 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 21:28:43.391330 6105 factory.go:656] Stopping watch factory\\\\nI0219 21:28:43.391334 6105 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 21:28:43.391323 6105 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 21:28:43.391350 6105 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 21:28:43.391356 6105 ovnkube.go:599] Stopped ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\
\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.878839 4795 kubelet_node_status.go:585] "Error updating node 
status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.881968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.881992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.882003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.882017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.882026 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.892231 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.894954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.894980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.894988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.894998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.895005 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.904789 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.917640 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.917745 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.918967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.918995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.919004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.919018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.919026 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021530 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124405 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226729 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.328993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.329033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.329044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.329058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.329068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.461525 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 21:57:45.846761219 +0000 UTC Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534682 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534827 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637849 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.708430 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/1.log" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.713523 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:45 crc kubenswrapper[4795]: E0219 21:28:45.713744 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.735698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740478 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.754641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5
bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.769469 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.782475 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.801130 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.812392 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.835007 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843196 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.855043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.868255 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.887001 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.901952 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.914479 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz"] Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.914931 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.916891 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.916996 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.917491 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66
438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.942125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945590 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.961353 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.972941 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.981460 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.999595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.999706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.999866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.999930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kmblv\" (UniqueName: \"kubernetes.io/projected/67ec5415-424e-40b7-9beb-171cd1f3dbe9-kube-api-access-kmblv\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.000695 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c
2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.
d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.015305 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6a
fee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.026501 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9
df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047523 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.053240 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.069991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.101530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.101679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.101717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmblv\" (UniqueName: 
\"kubernetes.io/projected/67ec5415-424e-40b7-9beb-171cd1f3dbe9-kube-api-access-kmblv\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.101757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.102676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.102691 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.106681 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 
21:28:46.119941 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.136652 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.141695 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmblv\" (UniqueName: \"kubernetes.io/projected/67ec5415-424e-40b7-9beb-171cd1f3dbe9-kube-api-access-kmblv\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150613 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150712 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.151536 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.163680 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.179343 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.190029 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.203310 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.215789 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.228812 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.232477 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\
":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apise
rver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 
21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: W0219 21:28:46.239576 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ec5415_424e_40b7_9beb_171cd1f3dbe9.slice/crio-46647e0b4c0675fefd39ff37753ffb51485edbf8b70456292a38fb7676ee7efd WatchSource:0}: Error finding container 46647e0b4c0675fefd39ff37753ffb51485edbf8b70456292a38fb7676ee7efd: Status 404 returned error can't find the container with id 46647e0b4c0675fefd39ff37753ffb51485edbf8b70456292a38fb7676ee7efd Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.245981 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253076 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355735 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.492962 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 09:16:55.111097551 +0000 UTC Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495784 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495836 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.511631 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:46 crc kubenswrapper[4795]: E0219 21:28:46.511751 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.511810 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:46 crc kubenswrapper[4795]: E0219 21:28:46.511937 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.512929 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:46 crc kubenswrapper[4795]: E0219 21:28:46.513049 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597954 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.638333 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700126 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.716539 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" event={"ID":"67ec5415-424e-40b7-9beb-171cd1f3dbe9","Type":"ContainerStarted","Data":"ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.716588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" event={"ID":"67ec5415-424e-40b7-9beb-171cd1f3dbe9","Type":"ContainerStarted","Data":"230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.716601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" event={"ID":"67ec5415-424e-40b7-9beb-171cd1f3dbe9","Type":"ContainerStarted","Data":"46647e0b4c0675fefd39ff37753ffb51485edbf8b70456292a38fb7676ee7efd"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.717105 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:46 crc kubenswrapper[4795]: E0219 21:28:46.717260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.736318 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.746701 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.756256 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.768480 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.778906 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.787809 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.797576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802491 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.808991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.819417 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.829996 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.853904 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.864634 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.882418 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.895475 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904974 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.908813 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.918293 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008134 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008250 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214409 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316689 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.369223 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ff4bs"] Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.369647 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: E0219 21:28:47.369700 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.383932 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc 
kubenswrapper[4795]: I0219 21:28:47.414583 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.416111 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.416233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw77f\" (UniqueName: \"kubernetes.io/projected/1b1b4346-e02e-4614-b2ff-e4628046a92f-kube-api-access-rw77f\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.418671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.418725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc 
kubenswrapper[4795]: I0219 21:28:47.418739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.418757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.418770 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.431596 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.444829 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.461349 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.479656 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.490500 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.493974 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 10:13:56.151111468 +0000 UTC Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.503610 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.517051 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.517534 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.517586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rw77f\" (UniqueName: \"kubernetes.io/projected/1b1b4346-e02e-4614-b2ff-e4628046a92f-kube-api-access-rw77f\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: E0219 21:28:47.517684 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:47 crc kubenswrapper[4795]: E0219 21:28:47.517772 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:28:48.017749655 +0000 UTC m=+39.210267589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521293 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.534089 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.541636 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw77f\" (UniqueName: \"kubernetes.io/projected/1b1b4346-e02e-4614-b2ff-e4628046a92f-kube-api-access-rw77f\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.550880 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.569565 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.585353 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.599759 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.613019 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623377 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.628907 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.641626 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725138 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.826934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.826981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.826989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.827007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.827016 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929501 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.022841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.022961 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.023013 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:28:49.022999035 +0000 UTC m=+40.215516889 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031908 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134211 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236244 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236315 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339079 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.440985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.441042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.441063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.441092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.441114 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.494264 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 16:28:20.792601536 +0000 UTC Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.510853 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.510907 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.510867 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.511120 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.511229 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.511302 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543605 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645972 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.748981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.749031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.749043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.749062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.749074 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851704 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953857 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.031613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:49 crc kubenswrapper[4795]: E0219 21:28:49.031794 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:49 crc kubenswrapper[4795]: E0219 21:28:49.031854 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:28:51.031836298 +0000 UTC m=+42.224354162 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055695 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158120 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260566 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363183 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464941 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464949 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.494913 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 13:42:06.836952082 +0000 UTC Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.511297 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:49 crc kubenswrapper[4795]: E0219 21:28:49.511412 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.531178 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.548937 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.567667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.567983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.568104 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.568259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.568368 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.574878 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.593561 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.617063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.630300 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.643032 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.656487 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.669642 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672202 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.686279 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.703976 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.716505 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.734641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc 
kubenswrapper[4795]: I0219 21:28:49.755537 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.771371 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.774820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.774842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.774850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 
21:28:49.774863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.774872 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.786511 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.796263 4795 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.877359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.877390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 
21:28:49.877400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.877415 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.877426 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979905 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082849 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.185400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.185727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.185823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.185964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.186062 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289321 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392406 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.494997 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:42:39.723753432 +0000 UTC Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495053 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.511527 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.511544 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:50 crc kubenswrapper[4795]: E0219 21:28:50.511605 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.511612 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:50 crc kubenswrapper[4795]: E0219 21:28:50.511709 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:50 crc kubenswrapper[4795]: E0219 21:28:50.511791 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700362 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803319 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906534 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009223 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.050504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:51 crc kubenswrapper[4795]: E0219 21:28:51.050676 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:51 crc kubenswrapper[4795]: E0219 21:28:51.050730 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:28:55.050713337 +0000 UTC m=+46.243231211 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111831 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.214958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.215010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.215029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.215047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.215059 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.317829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.318063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.318127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.318225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.318295 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420375 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420446 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.495321 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 06:21:17.996560819 +0000 UTC Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.510874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:51 crc kubenswrapper[4795]: E0219 21:28:51.511052 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.521989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.522028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.522040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.522056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.522068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624149 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726864 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828979 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931294 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931327 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033810 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136721 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136730 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.238707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.238958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.239049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.239135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.239289 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.340952 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.340994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.341003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.341017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.341026 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443550 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.495977 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 14:15:34.803411244 +0000 UTC Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.511368 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.511372 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:52 crc kubenswrapper[4795]: E0219 21:28:52.511838 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.511390 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:52 crc kubenswrapper[4795]: E0219 21:28:52.511691 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:52 crc kubenswrapper[4795]: E0219 21:28:52.512020 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545987 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648479 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750809 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.853514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.853815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.853915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.854022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.854147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957269 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064864 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064929 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166772 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.268684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.268925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.269003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.269075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.269145 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371698 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371963 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474689 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474902 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.496546 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 03:19:07.853274193 +0000 UTC Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.511054 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:53 crc kubenswrapper[4795]: E0219 21:28:53.511346 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577248 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680932 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783350 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783411 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886911 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886940 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989996 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092853 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196364 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196394 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298729 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298853 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.401775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.402044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.402109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.402205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.402284 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.497160 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 01:27:12.140175061 +0000 UTC Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505336 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505374 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.510794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.510917 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:54 crc kubenswrapper[4795]: E0219 21:28:54.511084 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.510794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:54 crc kubenswrapper[4795]: E0219 21:28:54.511299 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:54 crc kubenswrapper[4795]: E0219 21:28:54.510944 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607572 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607617 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607632 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607643 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710630 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710707 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814144 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917813 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021211 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021275 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056363 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.078441 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082778 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.086900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.087065 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.087217 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.087139168 +0000 UTC m=+54.279657072 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.103257 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-c
c9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.123605 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127275 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.141400 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145120 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.158939 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.159107 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161670 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264192 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264203 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367837 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470565 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.498308 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 07:42:28.506358376 +0000 UTC Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.511690 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.511842 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573749 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676822 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780076 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780156 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883381 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985713 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088876 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191788 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294472 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.397870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.397950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.397973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.398053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.398461 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.498601 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 20:01:50.853028521 +0000 UTC Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501474 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.511273 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.511305 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.511305 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:56 crc kubenswrapper[4795]: E0219 21:28:56.511466 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:56 crc kubenswrapper[4795]: E0219 21:28:56.511638 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:56 crc kubenswrapper[4795]: E0219 21:28:56.511773 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603691 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707601 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810538 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913842 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021455 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125717 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228742 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332644 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436300 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.499401 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 13:33:57.459071703 +0000 UTC Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.510610 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:57 crc kubenswrapper[4795]: E0219 21:28:57.510835 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539491 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642317 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848935 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952422 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055403 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.158917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.158996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.159012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.159065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.159082 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262634 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365438 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468551 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.500009 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 00:32:32.094501688 +0000 UTC Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.511395 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.511490 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:58 crc kubenswrapper[4795]: E0219 21:28:58.511613 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:58 crc kubenswrapper[4795]: E0219 21:28:58.512252 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.512779 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.513333 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:58 crc kubenswrapper[4795]: E0219 21:28:58.513488 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572856 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675639 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.755405 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/1.log" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.760139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.760796 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.775538 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.778687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.778740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.778758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc 
kubenswrapper[4795]: I0219 21:28:58.778780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.778798 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.789698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.812647 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.824689 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.838075 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.851049 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.863746 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.877368 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.881109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.881309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.881430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc 
kubenswrapper[4795]: I0219 21:28:58.881547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.881662 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.890839 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.902549 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.927406 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.944125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.956377 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.972266 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc 
kubenswrapper[4795]: I0219 21:28:58.983714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.983765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.983777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.983796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.983808 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.991158 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.001405 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.012062 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086736 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086837 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189535 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.291547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.291858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.291939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.292012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.292085 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394499 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496880 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.501015 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 16:28:43.243495386 +0000 UTC Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.511578 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:59 crc kubenswrapper[4795]: E0219 21:28:59.511756 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.525234 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls
\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.539681 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.552572 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.566077 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.587841 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.599248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.599459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.599658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.599857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.600035 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.602338 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.615977 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.629986 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.642073 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.654570 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.669357 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.693895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703371 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703410 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.727325 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.742813 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.754461 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.764583 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc 
kubenswrapper[4795]: I0219 21:28:59.764699 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/2.log" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.765340 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/1.log" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.768850 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" exitCode=1 Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.768933 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.769103 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.769403 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:28:59 crc kubenswrapper[4795]: E0219 21:28:59.769535 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.785713 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.796379 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805406 
4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: 
I0219 21:28:59.805734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.821154 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 
21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service 
openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service 
open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7
c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.829993 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.840122 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.851744 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.863815 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.878689 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.891129 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.900365 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911995 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.916958 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z 
is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.928109 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.937522 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.953711 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213
117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.963606 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.973243 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.988585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014643 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116703 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116727 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219249 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.320999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.321043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.321056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.321072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.321083 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.345331 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.345545 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:29:32.345520516 +0000 UTC m=+83.538038390 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423840 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423850 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.446051 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.446139 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446234 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446312 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:32.446293354 +0000 UTC m=+83.638811208 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.446306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.446372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446487 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446536 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446556 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446484 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446569 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446614 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446629 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:32.446605593 +0000 UTC m=+83.639123527 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446638 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446655 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:32.446644834 +0000 UTC m=+83.639162818 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446727 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:32.446700966 +0000 UTC m=+83.639218870 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.501365 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 16:44:03.167294156 +0000 UTC Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.511040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.511040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.511260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.511062 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.511336 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.511368 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526813 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629813 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.630133 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.643549 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.646693 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\
"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.668071 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89
c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for 
mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.690540 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.715687 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732958 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.742252 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.764771 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.778512 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/2.log" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.784563 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34b
b3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.786360 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.786671 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.802779 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc 
kubenswrapper[4795]: I0219 21:29:00.834282 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836393 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.855351 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.871764 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.890797 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.904606 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.919805 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.936778 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937866 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937930 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.947081 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.956150 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.965639 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.976825 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.989738 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.001475 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.008883 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.017531 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc 
kubenswrapper[4795]: I0219 21:29:01.032747 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040276 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040669 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.042357 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.051983 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.060944 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.070028 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.079411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.093552 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.102762 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.112523 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.122692 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.131483 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143785 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143701 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246801 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349687 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.452268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.452628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.453002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.453378 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.453591 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.502445 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 00:04:47.283024595 +0000 UTC Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.511027 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:01 crc kubenswrapper[4795]: E0219 21:29:01.511149 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555690 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555893 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659712 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762630 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762663 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968321 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070591 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173978 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277107 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381330 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484336 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484349 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.502921 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 15:44:37.145204642 +0000 UTC Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.511300 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:02 crc kubenswrapper[4795]: E0219 21:29:02.511880 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.511681 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:02 crc kubenswrapper[4795]: E0219 21:29:02.512060 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.511302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:02 crc kubenswrapper[4795]: E0219 21:29:02.512216 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586887 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.689987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.690043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.690061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.690084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.690101 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792126 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.894995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.895059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.895081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.895109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.895131 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998583 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.180766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:03 crc kubenswrapper[4795]: E0219 21:29:03.181021 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:03 crc kubenswrapper[4795]: E0219 21:29:03.181111 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:29:19.181086317 +0000 UTC m=+70.373604221 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205502 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308838 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.411754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.412111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.412381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.412591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.412859 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.504029 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 13:08:21.640712588 +0000 UTC Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.510979 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:03 crc kubenswrapper[4795]: E0219 21:29:03.511189 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516284 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618797 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618813 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722336 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825198 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.927767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.928195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.928215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.928233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.928246 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031621 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133157 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.235863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.235933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.235956 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.235981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.236000 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.338768 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.339104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.339208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.339314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.339391 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441689 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.505438 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 05:54:58.080200067 +0000 UTC Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.510768 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:04 crc kubenswrapper[4795]: E0219 21:29:04.510943 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.511015 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.511127 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:04 crc kubenswrapper[4795]: E0219 21:29:04.511352 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:04 crc kubenswrapper[4795]: E0219 21:29:04.511517 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.544923 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.544966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.544975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.544990 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.545002 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750802 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852708 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852845 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.957987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.958024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.958033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.958047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.958055 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060638 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.162920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.162978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.163001 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.163028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.163049 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265930 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.288925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.288991 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.289015 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.289046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.289071 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.309713 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:05Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314780 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.332104 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:05Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336822 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.353815 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:05Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358895 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.403020 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:05Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.403396 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405764 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.505657 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 09:15:58.84426875 +0000 UTC Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507872 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507882 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.511388 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.511512 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611194 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713784 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713864 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816676 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920465 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023534 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127678 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230768 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333579 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435938 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.506559 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 00:53:17.385416089 +0000 UTC Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.510902 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.510902 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:06 crc kubenswrapper[4795]: E0219 21:29:06.511057 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.511077 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:06 crc kubenswrapper[4795]: E0219 21:29:06.511148 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:06 crc kubenswrapper[4795]: E0219 21:29:06.511261 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538821 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745469 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848194 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848291 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951735 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055516 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158701 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158750 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261774 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364323 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467316 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467377 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.507144 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 17:00:12.980213872 +0000 UTC Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.511770 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:07 crc kubenswrapper[4795]: E0219 21:29:07.512012 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571919 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673797 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673859 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775943 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775956 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775965 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878416 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981288 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.084357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.084776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.084927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.085085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.085254 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.188894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.188957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.188974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.189038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.189060 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293323 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396572 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396651 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396693 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.499865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.499929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.499961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.499990 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.500007 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.508242 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 11:22:33.292224279 +0000 UTC Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.512130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.512236 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:08 crc kubenswrapper[4795]: E0219 21:29:08.512501 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.512137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:08 crc kubenswrapper[4795]: E0219 21:29:08.512584 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:08 crc kubenswrapper[4795]: E0219 21:29:08.512625 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603419 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705888 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808590 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.910992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.911053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.911079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.911108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.911129 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014584 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.121867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.122033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.122054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.122078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.122094 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224731 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328161 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328235 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431284 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.508598 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 09:42:45.041659164 +0000 UTC Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.510848 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:09 crc kubenswrapper[4795]: E0219 21:29:09.511108 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.528612 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534892 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.545025 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.560979 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47
780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.577693 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.594024 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.604840 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc 
kubenswrapper[4795]: I0219 21:29:09.621757 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.633375 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.642394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.642422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.642432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc 
kubenswrapper[4795]: I0219 21:29:09.642447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.642456 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.646592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 
21:29:09.663261 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.675441 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.686646 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.706387 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.717049 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.730740 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.744063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745768 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745819 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.755828 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.769754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d6882
99297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.847900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.847934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.847944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: 
I0219 21:29:09.847959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.847969 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.949934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.949997 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.950019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.950046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.950068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052703 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052733 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256991 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358795 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461107 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.509983 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 22:42:33.973508553 +0000 UTC Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.511256 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.511284 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.511338 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:10 crc kubenswrapper[4795]: E0219 21:29:10.511419 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:10 crc kubenswrapper[4795]: E0219 21:29:10.511507 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:10 crc kubenswrapper[4795]: E0219 21:29:10.511599 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564392 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667543 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770760 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873781 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873861 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976406 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081694 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186430 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289827 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392856 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495861 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.510835 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 00:45:14.525645727 +0000 UTC Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.511089 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:11 crc kubenswrapper[4795]: E0219 21:29:11.511365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599268 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702678 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805103 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908389 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908442 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012349 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114804 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217454 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320773 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.423874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.423940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.423955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.424026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.424040 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.511536 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:12 crc kubenswrapper[4795]: E0219 21:29:12.511658 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.511846 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:12 crc kubenswrapper[4795]: E0219 21:29:12.511895 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.512021 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:12 crc kubenswrapper[4795]: E0219 21:29:12.512079 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.512249 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 00:09:09.471252734 +0000 UTC Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526232 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628374 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628445 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730486 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832883 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.935565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.935870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.936003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.936209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.936358 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.039457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.039751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.039858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.039961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.040042 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143270 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246644 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349618 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.510695 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:13 crc kubenswrapper[4795]: E0219 21:29:13.511151 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.516330 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 07:56:49.459256912 +0000 UTC Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.555927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.555997 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.556013 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.556031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.556043 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658333 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760685 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760770 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760779 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862605 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965231 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067432 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170117 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272344 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374559 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477241 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.510646 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.510694 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.510734 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:14 crc kubenswrapper[4795]: E0219 21:29:14.510757 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:14 crc kubenswrapper[4795]: E0219 21:29:14.510813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:14 crc kubenswrapper[4795]: E0219 21:29:14.510917 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.517176 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 18:19:41.952522536 +0000 UTC Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579445 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.682682 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.682971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.682984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.682999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.683011 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785461 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887466 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887508 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989755 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092323 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092342 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092382 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194631 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296958 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.399949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.399988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.399998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.400014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.400024 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497841 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.511140 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.511611 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.511725 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.511858 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.511883 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515727 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.517283 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 09:22:51.128643495 +0000 UTC Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.538038 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",
\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541500 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.557314 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561191 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.583994 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.587975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.588020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.588032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.588049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.588061 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.603713 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.603958 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605704 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605766 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.810811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.811100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.811222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.811358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.811487 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.913696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.913955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.914227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.914445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.914568 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017478 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.120725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.121199 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.121444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.122064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.122588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.225851 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.226105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.226193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.226264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.226347 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329229 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431420 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.511178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:16 crc kubenswrapper[4795]: E0219 21:29:16.511275 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.511303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.511332 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:16 crc kubenswrapper[4795]: E0219 21:29:16.511375 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:16 crc kubenswrapper[4795]: E0219 21:29:16.511491 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.518537 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 12:08:48.955260037 +0000 UTC Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533863 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.636699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.636949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.637021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.637087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.637155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739691 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739715 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.842991 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.843032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.843044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.843061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.843074 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944899 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047714 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.150465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.150792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.150951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.151246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.151388 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253529 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356269 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458630 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.511253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:17 crc kubenswrapper[4795]: E0219 21:29:17.511450 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.519610 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:22:51.006375964 +0000 UTC Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560721 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560747 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662856 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764825 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.867989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.868028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.868037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.868050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.868060 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970626 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970705 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072712 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174948 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277148 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379152 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481616 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481626 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.511055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.511107 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:18 crc kubenswrapper[4795]: E0219 21:29:18.511215 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.511288 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:18 crc kubenswrapper[4795]: E0219 21:29:18.511392 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:18 crc kubenswrapper[4795]: E0219 21:29:18.511547 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.520119 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 16:10:37.38792007 +0000 UTC Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583534 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583565 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.788995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.789038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.789048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.789066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.789081 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892098 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.994973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.995028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.995045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.995070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.995087 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199780 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.251911 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:19 crc kubenswrapper[4795]: E0219 21:29:19.252073 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:19 crc kubenswrapper[4795]: E0219 21:29:19.252156 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:29:51.252141645 +0000 UTC m=+102.444659509 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301609 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404316 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505963 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.511313 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:19 crc kubenswrapper[4795]: E0219 21:29:19.511422 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.520988 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 20:16:01.400728976 +0000 UTC Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.532143 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.548526 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.557425 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.570821 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.582248 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.597558 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608023 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc 
kubenswrapper[4795]: I0219 21:29:19.608400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608656 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.619493 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.629756 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.641045 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.652098 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.662826 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.681793 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.692455 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.706444 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714904 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.718820 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5
bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.730372 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.739641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817957 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920410 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920434 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.021984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.022193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.022283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.022353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.022407 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124881 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.226990 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.227017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.227025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.227040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.227049 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329682 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431612 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431708 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.511371 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.511447 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:20 crc kubenswrapper[4795]: E0219 21:29:20.511479 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:20 crc kubenswrapper[4795]: E0219 21:29:20.511581 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.511922 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:20 crc kubenswrapper[4795]: E0219 21:29:20.512222 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.521566 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 01:11:40.660456072 +0000 UTC Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534295 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.637612 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.637858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.637947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.638031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.638122 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740550 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.840382 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/0.log" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.840427 4795 generic.go:334] "Generic (PLEG): container finished" podID="e967392b-9bd8-4111-b1b9-96d503a19668" containerID="ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d" exitCode=1 Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.840453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerDied","Data":"ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.840803 4795 scope.go:117] "RemoveContainer" containerID="ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842486 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.859670 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3
bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.877346 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.888220 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.896981 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc 
kubenswrapper[4795]: I0219 21:29:20.913396 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.923382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.931424 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.943793 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.944937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.944955 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.944965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.944979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.945004 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.955269 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.967216 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.990353 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.002477 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.016063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.029068 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.040682 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047173 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.051850 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.063226 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.074835 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149705 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252632 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355684 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458620 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.510834 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:21 crc kubenswrapper[4795]: E0219 21:29:21.510983 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.521726 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 07:15:39.942938654 +0000 UTC Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561555 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664174 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664202 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767825 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.846299 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/0.log" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.846365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.858102 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870490 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870919 4795 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.872650 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb0
85a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":
\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.884757 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.896589 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c
978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.911075 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc 
kubenswrapper[4795]: I0219 21:29:21.929970 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.942849 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.954592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.964336 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.972948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.973387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.973556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc 
kubenswrapper[4795]: I0219 21:29:21.974009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.974391 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.976437 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.986889 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.003596 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.012934 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.022396 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.033367 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.042522 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.053032 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.064331 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.076832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.076972 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.077066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.077182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.077267 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180481 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180489 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283302 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283311 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.385746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.386033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.386131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.386254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.386318 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488933 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.510886 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.510930 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:22 crc kubenswrapper[4795]: E0219 21:29:22.511000 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:22 crc kubenswrapper[4795]: E0219 21:29:22.511066 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.511326 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:22 crc kubenswrapper[4795]: E0219 21:29:22.511490 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.523062 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 17:11:27.599983745 +0000 UTC Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591582 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.693654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.693960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.694020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.694079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.694133 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797692 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797733 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.901891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.901951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.901968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.901994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.902011 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004389 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004429 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107655 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210627 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314482 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314512 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416947 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.510766 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:23 crc kubenswrapper[4795]: E0219 21:29:23.510893 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518932 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518954 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.524079 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 10:12:48.009016533 +0000 UTC Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621455 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724716 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827131 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928969 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031626 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133750 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133761 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236347 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338729 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338761 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441304 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.511427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.511427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:24 crc kubenswrapper[4795]: E0219 21:29:24.511600 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.511689 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:24 crc kubenswrapper[4795]: E0219 21:29:24.511818 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:24 crc kubenswrapper[4795]: E0219 21:29:24.511893 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.525074 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 04:30:32.870465938 +0000 UTC Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.543955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.543993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.544002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.544018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.544027 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.646939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.646999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.647017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.647042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.647060 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750782 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860352 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963625 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066746 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170698 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170715 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.273953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.274147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.274233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.274265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.274321 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378831 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.481970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.482031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.482049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.482074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.482092 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.511372 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:25 crc kubenswrapper[4795]: E0219 21:29:25.511561 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.525440 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 09:42:02.795383918 +0000 UTC Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.585966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.586038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.586063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.586091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.586112 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690130 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690316 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792662 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895561 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958386 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958565 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: E0219 21:29:25.978679 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:25Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984313 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: E0219 21:29:25.998590 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:25Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003544 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.018932 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:26Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023938 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.043075 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:26Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047350 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.064196 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:26Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.064527 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066999 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169276 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169338 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271781 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374530 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477616 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477690 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.510983 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.511052 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.511377 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.511548 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.511130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.512077 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.525674 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 14:42:15.284865661 +0000 UTC Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580460 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.686897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.686981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.687005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.687039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.687187 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790767 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892968 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996410 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996427 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099477 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202894 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.305918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.305984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.306005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.306031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.306048 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408838 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.510780 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:27 crc kubenswrapper[4795]: E0219 21:29:27.510971 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511604 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.526378 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 17:07:09.152403131 +0000 UTC Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614795 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717319 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819594 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.921974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.922012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.922021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.922033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.922042 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024697 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126990 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229653 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332244 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332267 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434260 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434273 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.511599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.511657 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.511622 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:28 crc kubenswrapper[4795]: E0219 21:29:28.511795 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:28 crc kubenswrapper[4795]: E0219 21:29:28.511891 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:28 crc kubenswrapper[4795]: E0219 21:29:28.512279 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.512707 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.527557 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 02:20:50.184964946 +0000 UTC Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536936 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639252 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742294 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742378 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844711 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.872380 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/2.log" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.875502 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.875872 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.899705 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.919853 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.938791 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.946926 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.946980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.946993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.947010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.947021 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.957613 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.975902 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac09
4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.990248 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.001744 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c
978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.012279 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc 
kubenswrapper[4795]: I0219 21:29:29.029024 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.041228 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.048667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.048722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.048739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 
21:29:29.048760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.048778 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.053064 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.068571 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.082363 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.093927 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.113703 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service 
open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.125419 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.139249 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152774 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.156944 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254975 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357582 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460749 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.511231 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:29 crc kubenswrapper[4795]: E0219 21:29:29.511363 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.524705 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f55
66b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.528454 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 21:26:29.53218513 +0000 UTC Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.541766 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.553424 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563404 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563427 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.569337 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.580929 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.590850 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.612244 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service 
open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.626272 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.636559 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665605 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.673081 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.695049 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.704786 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.714610 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc 
kubenswrapper[4795]: I0219 21:29:29.732179 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.743017 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.752868 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768617 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768673 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.781479 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871225 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.881391 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.882214 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/2.log" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.884911 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" exitCode=1 Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.884957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.885001 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.885659 4795 scope.go:117] 
"RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:29:29 crc kubenswrapper[4795]: E0219 21:29:29.885880 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.901093 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.910712 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.924423 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.939545 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.953424 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973269 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:29Z\\\",\\\"message\\\":\\\"dding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:29:29.388497 6838 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:29:29.388507 6838 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:29:29.388515 6838 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:29:29.388436 6838 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0219 21:29:29.388402 6838 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0219 21:29:29.388466 6838 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-jvnv5 in node crc\\\\nI0219 21:29:29.388540 6838 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-jvnv5 after 0 failed attempt(s)\\\\nI0219 21:29:29.388544 6838 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-jvnv5\\\\nI0219 21:29:29.388279 6838 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cn
i/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973444 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973496 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.985260 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.998506 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T
21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6
859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.013726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.025740 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.037383 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.058603 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff
98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.069070 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.075419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.075452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc 
kubenswrapper[4795]: I0219 21:29:30.075462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.075494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.075503 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.077815 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.090828 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.101691 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.109966 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c
978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.120829 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc 
kubenswrapper[4795]: I0219 21:29:30.178293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.178611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.178749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.178903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.179054 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281522 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383784 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383833 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.486406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.486727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.486907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.487124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.487334 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.510871 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.510980 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.510983 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:30 crc kubenswrapper[4795]: E0219 21:29:30.511690 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:30 crc kubenswrapper[4795]: E0219 21:29:30.511803 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:30 crc kubenswrapper[4795]: E0219 21:29:30.511923 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.529488 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 04:08:42.738903615 +0000 UTC Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590138 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692922 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795182 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.889415 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.892915 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:29:30 crc kubenswrapper[4795]: E0219 21:29:30.893081 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896775 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.909754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.923313 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.936425 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.950803 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.973107 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:29Z\\\",\\\"message\\\":\\\"dding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:29:29.388497 6838 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:29:29.388507 6838 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 
after 0 failed attempt(s)\\\\nI0219 21:29:29.388515 6838 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:29:29.388436 6838 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0219 21:29:29.388402 6838 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0219 21:29:29.388466 6838 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-jvnv5 in node crc\\\\nI0219 21:29:29.388540 6838 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-jvnv5 after 0 failed attempt(s)\\\\nI0219 21:29:29.388544 6838 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-jvnv5\\\\nI0219 21:29:29.388279 6838 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:29:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.989816 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998647 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998680 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.006350 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5
bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.023411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.037350 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.048431 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.066630 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc 
kubenswrapper[4795]: I0219 21:29:31.095622 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105530 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.115687 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.131423 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.147640 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.167030 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.188498 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.202998 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.207916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.207975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.207995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc 
kubenswrapper[4795]: I0219 21:29:31.208020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.208037 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.311492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.311983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.312079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.312188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.312283 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415258 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.511556 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:31 crc kubenswrapper[4795]: E0219 21:29:31.511874 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516970 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.526716 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.529665 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 22:33:06.832992682 +0000 UTC Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619989 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825491 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928210 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032840 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032868 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.136322 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.136655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.136843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.137010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.137128 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.240942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.240999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.241017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.241037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.241049 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.381496 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.381922 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:30:36.381887344 +0000 UTC m=+147.574405248 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447367 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.482348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.482425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.482474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482543 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482630 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.482612434 +0000 UTC m=+147.675130298 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.482552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482685 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482692 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482713 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482737 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482782 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.482762068 +0000 UTC m=+147.675279952 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482805 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.482796459 +0000 UTC m=+147.675314333 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482860 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482885 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482900 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482969 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.482948864 +0000 UTC m=+147.675466728 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.511185 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.511299 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.511356 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.511364 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.511578 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.511665 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.530416 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 12:48:54.001068845 +0000 UTC Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.548953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.549109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.549187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.549264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.549330 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651692 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651721 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754765 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858587 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961595 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064983 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168580 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271675 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374626 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374688 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.477868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.477934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.477956 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.477983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.478003 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.510812 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:33 crc kubenswrapper[4795]: E0219 21:29:33.511067 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.531235 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 06:34:30.991199645 +0000 UTC Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580956 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684655 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788238 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788256 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891843 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.994989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.995067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.995088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.995119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.995142 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098703 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098720 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202632 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202687 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305889 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408557 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.510553 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.510681 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:34 crc kubenswrapper[4795]: E0219 21:29:34.510876 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.510992 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:34 crc kubenswrapper[4795]: E0219 21:29:34.511091 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:34 crc kubenswrapper[4795]: E0219 21:29:34.511143 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512495 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.531903 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 17:36:52.018493243 +0000 UTC Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615553 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718415 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821820 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924685 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924704 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924721 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028736 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131916 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235444 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442130 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442208 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.511500 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:35 crc kubenswrapper[4795]: E0219 21:29:35.511697 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.532831 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 20:07:13.390109887 +0000 UTC Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545979 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649322 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649341 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751871 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854953 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957649 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060666 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163508 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262547 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.279961 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.283647 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.283927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.284133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.284420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.284653 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.298076 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.302978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.303331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.303587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.303761 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.303922 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.318003 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322620 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.337925 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342275 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.355557 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.355734 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357649 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.461161 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.461769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.461912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.462040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.462209 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.510614 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.510619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.510715 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.510954 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.511196 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.511637 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.535244 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 01:03:59.930389504 +0000 UTC Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565769 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669429 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.771937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.771976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.771984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.772000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.772009 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.874971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.875019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.875030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.875047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.875059 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977692 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.079987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.080045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.080060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.080081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.080097 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183596 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287604 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390796 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493583 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.511445 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:37 crc kubenswrapper[4795]: E0219 21:29:37.511636 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.535927 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 02:17:45.367206751 +0000 UTC Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596573 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699871 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.801918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.801998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.802017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.802062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.802079 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.904854 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.904921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.904944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.905008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.905034 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008632 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008674 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.112746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.113133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.113310 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.113447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.113600 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217292 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321130 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423789 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.510762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.510821 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.510864 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:38 crc kubenswrapper[4795]: E0219 21:29:38.510956 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:38 crc kubenswrapper[4795]: E0219 21:29:38.511076 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:38 crc kubenswrapper[4795]: E0219 21:29:38.511183 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526294 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.536909 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 22:01:22.62978722 +0000 UTC Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.629928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.629985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.629998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.630018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.630030 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732612 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732648 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835705 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938607 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041665 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.144994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.145066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.145087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.145113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.145130 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248216 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351144 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459466 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459537 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.510979 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:39 crc kubenswrapper[4795]: E0219 21:29:39.511640 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.533361 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 
21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.537081 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 21:08:06.320435583 +0000 UTC Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.549962 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564775 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.656473 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.656455011 podStartE2EDuration="1m8.656455011s" podCreationTimestamp="2026-02-19 21:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.655509475 +0000 UTC m=+90.848027329" watchObservedRunningTime="2026-02-19 21:29:39.656455011 +0000 UTC m=+90.848972875" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667417 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.696433 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l29c7" podStartSLOduration=66.696410872 podStartE2EDuration="1m6.696410872s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.695243779 +0000 UTC m=+90.887761663" watchObservedRunningTime="2026-02-19 21:29:39.696410872 +0000 UTC m=+90.888928736" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.696842 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-blzsk" podStartSLOduration=66.696833774 podStartE2EDuration="1m6.696833774s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.678632659 +0000 UTC m=+90.871150593" watchObservedRunningTime="2026-02-19 21:29:39.696833774 +0000 UTC m=+90.889351648" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.709539 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5p6d9" podStartSLOduration=66.709515242 podStartE2EDuration="1m6.709515242s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.707683991 +0000 UTC m=+90.900201875" watchObservedRunningTime="2026-02-19 21:29:39.709515242 +0000 UTC m=+90.902033136" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.718970 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jvnv5" podStartSLOduration=66.718948899 podStartE2EDuration="1m6.718948899s" 
podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.718103625 +0000 UTC m=+90.910621499" watchObservedRunningTime="2026-02-19 21:29:39.718948899 +0000 UTC m=+90.911466773" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.749940 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podStartSLOduration=66.749922266 podStartE2EDuration="1m6.749922266s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.749817343 +0000 UTC m=+90.942335247" watchObservedRunningTime="2026-02-19 21:29:39.749922266 +0000 UTC m=+90.942440130" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.762841 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.76281736 podStartE2EDuration="8.76281736s" podCreationTimestamp="2026-02-19 21:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.761961466 +0000 UTC m=+90.954479330" watchObservedRunningTime="2026-02-19 21:29:39.76281736 +0000 UTC m=+90.955335264" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770444 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.779642 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.779618336 podStartE2EDuration="39.779618336s" podCreationTimestamp="2026-02-19 21:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.778441902 +0000 UTC m=+90.970959786" watchObservedRunningTime="2026-02-19 21:29:39.779618336 +0000 UTC m=+90.972136240" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.832983 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" podStartSLOduration=66.832963415 podStartE2EDuration="1m6.832963415s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.832039439 +0000 UTC m=+91.024557303" watchObservedRunningTime="2026-02-19 21:29:39.832963415 +0000 UTC m=+91.025481299" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.872680 4795 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.872915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.872982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.873055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.873128 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975818 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.077568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.077796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.077903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.077984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.078060 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180822 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.283525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.283863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.284024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.284192 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.284334 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.386594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.386964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.387103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.387285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.387422 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490526 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.511490 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.511526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:40 crc kubenswrapper[4795]: E0219 21:29:40.511656 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:40 crc kubenswrapper[4795]: E0219 21:29:40.511759 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.511526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:40 crc kubenswrapper[4795]: E0219 21:29:40.512260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.537719 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 14:30:04.713145803 +0000 UTC Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593850 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698130 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801675 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905420 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.007551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.007798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.007865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.007938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.008000 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111323 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214490 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214953 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317629 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317765 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420237 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420262 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.510924 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:41 crc kubenswrapper[4795]: E0219 21:29:41.511040 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522979 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.538759 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 16:47:44.756201663 +0000 UTC Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625584 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728393 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831241 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933646 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036723 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036736 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139514 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243484 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346446 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448701 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448710 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.511526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.511905 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.512208 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:42 crc kubenswrapper[4795]: E0219 21:29:42.512312 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:42 crc kubenswrapper[4795]: E0219 21:29:42.512439 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:42 crc kubenswrapper[4795]: E0219 21:29:42.512541 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.512616 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:29:42 crc kubenswrapper[4795]: E0219 21:29:42.512799 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.539434 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 22:10:18.445984639 +0000 UTC Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551558 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653592 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755206 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857397 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959752 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164302 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266554 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368881 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471157 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471185 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.510847 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:43 crc kubenswrapper[4795]: E0219 21:29:43.510991 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.540679 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 22:56:32.84282829 +0000 UTC Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573283 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675642 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777595 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880453 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983244 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086598 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189753 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.292998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.293073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.293091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.293114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.293132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396629 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396647 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499909 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.511470 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.511551 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.511485 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:44 crc kubenswrapper[4795]: E0219 21:29:44.511708 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:44 crc kubenswrapper[4795]: E0219 21:29:44.511829 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:44 crc kubenswrapper[4795]: E0219 21:29:44.511967 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.541230 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 07:48:40.414171664 +0000 UTC Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602570 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706355 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.809664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.810068 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.810328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.810552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.810756 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.018357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.018752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.018950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.019152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.019407 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.121665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.121969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.122088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.122227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.122344 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.225315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.225637 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.225819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.225980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.226144 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329221 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431375 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431487 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.511068 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:45 crc kubenswrapper[4795]: E0219 21:29:45.511466 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534078 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.542812 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 08:10:57.798662672 +0000 UTC Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636692 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636735 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636759 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738897 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.943689 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.943903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.943994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.944055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.944119 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047895 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.150994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.151348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.151642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.151780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.151927 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254464 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357611 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461350 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.510863 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.510902 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.510998 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:46 crc kubenswrapper[4795]: E0219 21:29:46.511061 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:46 crc kubenswrapper[4795]: E0219 21:29:46.511266 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:46 crc kubenswrapper[4795]: E0219 21:29:46.511450 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.544037 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:31:49.604200175 +0000 UTC Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564415 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564889 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667796 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.690224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.690483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.690822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.691141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.691489 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.747679 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6"] Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.748055 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.750570 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.750575 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.750831 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.753231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.773320 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.773305587 podStartE2EDuration="1m18.773305587s" podCreationTimestamp="2026-02-19 21:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:46.772686729 +0000 UTC m=+97.965204633" watchObservedRunningTime="2026-02-19 21:29:46.773305587 +0000 UTC m=+97.965823451" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.790124 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.790106442 podStartE2EDuration="1m11.790106442s" podCreationTimestamp="2026-02-19 21:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:46.789721231 +0000 UTC m=+97.982239115" watchObservedRunningTime="2026-02-19 21:29:46.790106442 +0000 UTC 
m=+97.982624306" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838320 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a207a66-b652-4d53-9424-48b3f88d4c93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838370 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a207a66-b652-4d53-9424-48b3f88d4c93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a207a66-b652-4d53-9424-48b3f88d4c93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a207a66-b652-4d53-9424-48b3f88d4c93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a207a66-b652-4d53-9424-48b3f88d4c93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939118 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a207a66-b652-4d53-9424-48b3f88d4c93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a207a66-b652-4d53-9424-48b3f88d4c93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.947734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8a207a66-b652-4d53-9424-48b3f88d4c93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.957757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a207a66-b652-4d53-9424-48b3f88d4c93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.068527 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:47 crc kubenswrapper[4795]: W0219 21:29:47.088060 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a207a66_b652_4d53_9424_48b3f88d4c93.slice/crio-e34a32ae52ae70c6baaa514d5575954eeeee61e00f13acd641a6a1b2305f1382 WatchSource:0}: Error finding container e34a32ae52ae70c6baaa514d5575954eeeee61e00f13acd641a6a1b2305f1382: Status 404 returned error can't find the container with id e34a32ae52ae70c6baaa514d5575954eeeee61e00f13acd641a6a1b2305f1382 Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.511775 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:47 crc kubenswrapper[4795]: E0219 21:29:47.511947 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.544717 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 09:24:55.411879249 +0000 UTC Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.544814 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.552879 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.947589 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" event={"ID":"8a207a66-b652-4d53-9424-48b3f88d4c93","Type":"ContainerStarted","Data":"fd72049883e0afb17c0fdce1f500ad100bfded9e38a3818a9f0a3e29a04b09c7"} Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.947666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" event={"ID":"8a207a66-b652-4d53-9424-48b3f88d4c93","Type":"ContainerStarted","Data":"e34a32ae52ae70c6baaa514d5575954eeeee61e00f13acd641a6a1b2305f1382"} Feb 19 21:29:48 crc kubenswrapper[4795]: I0219 21:29:48.510815 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:48 crc kubenswrapper[4795]: E0219 21:29:48.511027 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:48 crc kubenswrapper[4795]: I0219 21:29:48.511333 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:48 crc kubenswrapper[4795]: I0219 21:29:48.511403 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:48 crc kubenswrapper[4795]: E0219 21:29:48.511532 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:48 crc kubenswrapper[4795]: E0219 21:29:48.511739 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:49 crc kubenswrapper[4795]: I0219 21:29:49.511030 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:49 crc kubenswrapper[4795]: E0219 21:29:49.511930 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:50 crc kubenswrapper[4795]: I0219 21:29:50.510985 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:50 crc kubenswrapper[4795]: I0219 21:29:50.511046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:50 crc kubenswrapper[4795]: I0219 21:29:50.511058 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:50 crc kubenswrapper[4795]: E0219 21:29:50.511122 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:50 crc kubenswrapper[4795]: E0219 21:29:50.511243 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:50 crc kubenswrapper[4795]: E0219 21:29:50.511343 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:51 crc kubenswrapper[4795]: I0219 21:29:51.288357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:51 crc kubenswrapper[4795]: E0219 21:29:51.288506 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:51 crc kubenswrapper[4795]: E0219 21:29:51.288582 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:30:55.28856317 +0000 UTC m=+166.481081044 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:51 crc kubenswrapper[4795]: I0219 21:29:51.510934 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:51 crc kubenswrapper[4795]: E0219 21:29:51.511095 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:52 crc kubenswrapper[4795]: I0219 21:29:52.510660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:52 crc kubenswrapper[4795]: E0219 21:29:52.510791 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:52 crc kubenswrapper[4795]: I0219 21:29:52.510668 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:52 crc kubenswrapper[4795]: I0219 21:29:52.510660 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:52 crc kubenswrapper[4795]: E0219 21:29:52.510855 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:52 crc kubenswrapper[4795]: E0219 21:29:52.511023 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:53 crc kubenswrapper[4795]: I0219 21:29:53.510810 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:53 crc kubenswrapper[4795]: E0219 21:29:53.511110 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:54 crc kubenswrapper[4795]: I0219 21:29:54.510586 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:54 crc kubenswrapper[4795]: I0219 21:29:54.510652 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:54 crc kubenswrapper[4795]: I0219 21:29:54.510651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:54 crc kubenswrapper[4795]: E0219 21:29:54.510795 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:54 crc kubenswrapper[4795]: E0219 21:29:54.510906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:54 crc kubenswrapper[4795]: E0219 21:29:54.511035 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:55 crc kubenswrapper[4795]: I0219 21:29:55.511464 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:55 crc kubenswrapper[4795]: E0219 21:29:55.512215 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:55 crc kubenswrapper[4795]: I0219 21:29:55.512725 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:29:55 crc kubenswrapper[4795]: E0219 21:29:55.513008 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:56 crc kubenswrapper[4795]: I0219 21:29:56.511292 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:56 crc kubenswrapper[4795]: I0219 21:29:56.511344 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:56 crc kubenswrapper[4795]: I0219 21:29:56.511292 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:56 crc kubenswrapper[4795]: E0219 21:29:56.511559 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:56 crc kubenswrapper[4795]: E0219 21:29:56.511719 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:56 crc kubenswrapper[4795]: E0219 21:29:56.511842 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:57 crc kubenswrapper[4795]: I0219 21:29:57.511476 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:57 crc kubenswrapper[4795]: E0219 21:29:57.511615 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:58 crc kubenswrapper[4795]: I0219 21:29:58.511084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:58 crc kubenswrapper[4795]: I0219 21:29:58.511640 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:58 crc kubenswrapper[4795]: E0219 21:29:58.511728 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:58 crc kubenswrapper[4795]: I0219 21:29:58.511900 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:58 crc kubenswrapper[4795]: E0219 21:29:58.511958 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:58 crc kubenswrapper[4795]: E0219 21:29:58.512062 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:59 crc kubenswrapper[4795]: I0219 21:29:59.511378 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:59 crc kubenswrapper[4795]: E0219 21:29:59.512775 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:00 crc kubenswrapper[4795]: I0219 21:30:00.511553 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:00 crc kubenswrapper[4795]: I0219 21:30:00.511552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:00 crc kubenswrapper[4795]: I0219 21:30:00.511645 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:00 crc kubenswrapper[4795]: E0219 21:30:00.511885 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:00 crc kubenswrapper[4795]: E0219 21:30:00.512003 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:00 crc kubenswrapper[4795]: E0219 21:30:00.512676 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:01 crc kubenswrapper[4795]: I0219 21:30:01.511744 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:01 crc kubenswrapper[4795]: E0219 21:30:01.512321 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:02 crc kubenswrapper[4795]: I0219 21:30:02.511282 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:02 crc kubenswrapper[4795]: I0219 21:30:02.511339 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:02 crc kubenswrapper[4795]: E0219 21:30:02.511480 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:02 crc kubenswrapper[4795]: I0219 21:30:02.511351 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:02 crc kubenswrapper[4795]: E0219 21:30:02.511629 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:02 crc kubenswrapper[4795]: E0219 21:30:02.511671 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:03 crc kubenswrapper[4795]: I0219 21:30:03.511274 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:03 crc kubenswrapper[4795]: E0219 21:30:03.511452 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:04 crc kubenswrapper[4795]: I0219 21:30:04.511242 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:04 crc kubenswrapper[4795]: I0219 21:30:04.511339 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:04 crc kubenswrapper[4795]: E0219 21:30:04.511450 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:04 crc kubenswrapper[4795]: E0219 21:30:04.511715 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:04 crc kubenswrapper[4795]: I0219 21:30:04.512362 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:04 crc kubenswrapper[4795]: E0219 21:30:04.512541 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:05 crc kubenswrapper[4795]: I0219 21:30:05.511050 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:05 crc kubenswrapper[4795]: E0219 21:30:05.511253 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:06 crc kubenswrapper[4795]: I0219 21:30:06.511019 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:06 crc kubenswrapper[4795]: I0219 21:30:06.511128 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:06 crc kubenswrapper[4795]: I0219 21:30:06.511134 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:06 crc kubenswrapper[4795]: E0219 21:30:06.511438 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:06 crc kubenswrapper[4795]: E0219 21:30:06.511526 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:06 crc kubenswrapper[4795]: E0219 21:30:06.511616 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.004647 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/1.log"
Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.005691 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/0.log"
Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.005753 4795 generic.go:334] "Generic (PLEG): container finished" podID="e967392b-9bd8-4111-b1b9-96d503a19668" containerID="02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b" exitCode=1
Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.005809 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerDied","Data":"02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b"}
Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.005910 4795 scope.go:117] "RemoveContainer" containerID="ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d"
Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.007217 4795 scope.go:117] "RemoveContainer" containerID="02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b"
Feb 19 21:30:07 crc kubenswrapper[4795]: E0219 21:30:07.007658 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5p6d9_openshift-multus(e967392b-9bd8-4111-b1b9-96d503a19668)\"" pod="openshift-multus/multus-5p6d9" podUID="e967392b-9bd8-4111-b1b9-96d503a19668"
Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.031303 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" podStartSLOduration=94.031278446 podStartE2EDuration="1m34.031278446s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:47.967341117 +0000 UTC m=+99.159859011" watchObservedRunningTime="2026-02-19 21:30:07.031278446 +0000 UTC m=+118.223796340"
Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.511043 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:07 crc kubenswrapper[4795]: E0219 21:30:07.511211 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:08 crc kubenswrapper[4795]: I0219 21:30:08.010791 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/1.log"
Feb 19 21:30:08 crc kubenswrapper[4795]: I0219 21:30:08.511383 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:08 crc kubenswrapper[4795]: I0219 21:30:08.511476 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:08 crc kubenswrapper[4795]: I0219 21:30:08.511396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:08 crc kubenswrapper[4795]: E0219 21:30:08.511589 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:08 crc kubenswrapper[4795]: E0219 21:30:08.511719 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:08 crc kubenswrapper[4795]: E0219 21:30:08.511865 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:09 crc kubenswrapper[4795]: E0219 21:30:09.478727 4795 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Feb 19 21:30:09 crc kubenswrapper[4795]: I0219 21:30:09.511555 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:09 crc kubenswrapper[4795]: E0219 21:30:09.512508 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:09 crc kubenswrapper[4795]: E0219 21:30:09.596555 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 21:30:10 crc kubenswrapper[4795]: I0219 21:30:10.511436 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:10 crc kubenswrapper[4795]: I0219 21:30:10.511466 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:10 crc kubenswrapper[4795]: I0219 21:30:10.511506 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:10 crc kubenswrapper[4795]: E0219 21:30:10.511763 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:10 crc kubenswrapper[4795]: E0219 21:30:10.511838 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:10 crc kubenswrapper[4795]: E0219 21:30:10.512007 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:10 crc kubenswrapper[4795]: I0219 21:30:10.513309 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"
Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.022582 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log"
Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.026807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"}
Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.027291 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl"
Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.068812 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podStartSLOduration=98.068794743 podStartE2EDuration="1m38.068794743s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:11.067787985 +0000 UTC m=+122.260305849" watchObservedRunningTime="2026-02-19 21:30:11.068794743 +0000 UTC m=+122.261312607"
Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.414973 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ff4bs"]
Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.415085 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:11 crc kubenswrapper[4795]: E0219 21:30:11.415189 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:12 crc kubenswrapper[4795]: I0219 21:30:12.511041 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:12 crc kubenswrapper[4795]: I0219 21:30:12.511067 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:12 crc kubenswrapper[4795]: I0219 21:30:12.511041 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:12 crc kubenswrapper[4795]: E0219 21:30:12.511193 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:12 crc kubenswrapper[4795]: E0219 21:30:12.511291 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:12 crc kubenswrapper[4795]: E0219 21:30:12.511356 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:13 crc kubenswrapper[4795]: I0219 21:30:13.511766 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:13 crc kubenswrapper[4795]: E0219 21:30:13.512001 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:14 crc kubenswrapper[4795]: I0219 21:30:14.511661 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:14 crc kubenswrapper[4795]: I0219 21:30:14.511691 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:14 crc kubenswrapper[4795]: E0219 21:30:14.511938 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:14 crc kubenswrapper[4795]: E0219 21:30:14.511975 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:14 crc kubenswrapper[4795]: I0219 21:30:14.511699 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:14 crc kubenswrapper[4795]: E0219 21:30:14.512410 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:14 crc kubenswrapper[4795]: E0219 21:30:14.597881 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 21:30:15 crc kubenswrapper[4795]: I0219 21:30:15.510777 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:15 crc kubenswrapper[4795]: E0219 21:30:15.510978 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:16 crc kubenswrapper[4795]: I0219 21:30:16.511350 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:16 crc kubenswrapper[4795]: I0219 21:30:16.511360 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:16 crc kubenswrapper[4795]: E0219 21:30:16.511533 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:16 crc kubenswrapper[4795]: I0219 21:30:16.511375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:16 crc kubenswrapper[4795]: E0219 21:30:16.511599 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:16 crc kubenswrapper[4795]: E0219 21:30:16.511709 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:16 crc kubenswrapper[4795]: I0219 21:30:16.661925 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl"
Feb 19 21:30:17 crc kubenswrapper[4795]: I0219 21:30:17.511482 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:17 crc kubenswrapper[4795]: E0219 21:30:17.511619 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:18 crc kubenswrapper[4795]: I0219 21:30:18.510624 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:18 crc kubenswrapper[4795]: I0219 21:30:18.510671 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:18 crc kubenswrapper[4795]: E0219 21:30:18.510812 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:18 crc kubenswrapper[4795]: I0219 21:30:18.510858 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:18 crc kubenswrapper[4795]: E0219 21:30:18.511045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:18 crc kubenswrapper[4795]: E0219 21:30:18.511216 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:19 crc kubenswrapper[4795]: I0219 21:30:19.510856 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:19 crc kubenswrapper[4795]: E0219 21:30:19.513059 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:19 crc kubenswrapper[4795]: E0219 21:30:19.599435 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 21:30:20 crc kubenswrapper[4795]: I0219 21:30:20.511132 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:20 crc kubenswrapper[4795]: I0219 21:30:20.511211 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:20 crc kubenswrapper[4795]: I0219 21:30:20.511150 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:20 crc kubenswrapper[4795]: E0219 21:30:20.511343 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:20 crc kubenswrapper[4795]: E0219 21:30:20.511601 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:20 crc kubenswrapper[4795]: E0219 21:30:20.511790 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:20 crc kubenswrapper[4795]: I0219 21:30:20.512112 4795 scope.go:117] "RemoveContainer" containerID="02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b"
Feb 19 21:30:21 crc kubenswrapper[4795]: I0219 21:30:21.058976 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/1.log"
Feb 19 21:30:21 crc kubenswrapper[4795]: I0219 21:30:21.059345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9"}
Feb 19 21:30:21 crc kubenswrapper[4795]: I0219 21:30:21.511036 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:21 crc kubenswrapper[4795]: E0219 21:30:21.511262 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:22 crc kubenswrapper[4795]: I0219 21:30:22.510867 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:22 crc kubenswrapper[4795]: I0219 21:30:22.510920 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:22 crc kubenswrapper[4795]: I0219 21:30:22.510981 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:22 crc kubenswrapper[4795]: E0219 21:30:22.511028 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:22 crc kubenswrapper[4795]: E0219 21:30:22.511102 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:22 crc kubenswrapper[4795]: E0219 21:30:22.511239 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:23 crc kubenswrapper[4795]: I0219 21:30:23.510922 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:23 crc kubenswrapper[4795]: E0219 21:30:23.511130 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f"
Feb 19 21:30:24 crc kubenswrapper[4795]: I0219 21:30:24.511599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:24 crc kubenswrapper[4795]: I0219 21:30:24.511599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:24 crc kubenswrapper[4795]: I0219 21:30:24.511622 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:24 crc kubenswrapper[4795]: E0219 21:30:24.511863 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 21:30:24 crc kubenswrapper[4795]: E0219 21:30:24.511962 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 21:30:24 crc kubenswrapper[4795]: E0219 21:30:24.511906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 21:30:25 crc kubenswrapper[4795]: I0219 21:30:25.511052 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs"
Feb 19 21:30:25 crc kubenswrapper[4795]: I0219 21:30:25.514460 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 19 21:30:25 crc kubenswrapper[4795]: I0219 21:30:25.518869 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.510580 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.510665 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.510665 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.512926 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.513020 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.513055 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.514489 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.609885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.650811 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.651218 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.652951 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.653987 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qks97"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.654774 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.655254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.655644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.655924 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.656036 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.656900 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t48rm"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.657002 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.657019 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.657030 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.657073 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.658539 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.659462 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.660029 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.659618 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.659690 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.664743 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.659746 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.665763 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.669913 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.670252 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.670282 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.670326 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.685060 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.685208 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.687207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.689107 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.690202 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.690798 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.691599 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.694329 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.695519 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.695619 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.695759 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.696058 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.698849 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.699345 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.699602 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.699836 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700040 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700342 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700339 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700343 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700629 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700636 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700896 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700841 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700943 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700888 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701016 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701065 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701114 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701227 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701251 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701353 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701496 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.703566 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bks74"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704243 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704745 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-rt6qz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704848 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704923 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704990 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705089 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705110 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705281 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705331 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705390 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705284 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705488 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705587 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.718747 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.719102 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.719386 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.719468 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.719502 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.725703 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.726392 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7mzng"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.726952 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.722045 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.727455 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.726313 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.727738 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.728095 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729120 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729279 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729426 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729532 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729579 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.745911 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.747466 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.749012 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.749801 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.745992 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.751031 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.751378 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.746802 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.747713 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.748700 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.762144 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763411 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-encryption-config\") pod 
\"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763435 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-node-pullsecrets\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/958d08af-86cf-467f-947b-4485163fd695-machine-approver-tls\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763518 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4sp\" (UniqueName: \"kubernetes.io/projected/96e1b4b4-8d48-4955-b756-71d21a5aea0b-kube-api-access-4w4sp\") pod \"downloads-7954f5f757-rt6qz\" (UID: \"96e1b4b4-8d48-4955-b756-71d21a5aea0b\") " pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-serving-cert\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763606 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") 
pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763632 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-config\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f28c04c3-66f1-4c29-b7d1-cac5aa342370-serving-cert\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763672 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-config\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763709 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763729 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763766 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxdq\" (UniqueName: \"kubernetes.io/projected/958d08af-86cf-467f-947b-4485163fd695-kube-api-access-6cxdq\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: 
\"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xv7l\" (UniqueName: \"kubernetes.io/projected/f28c04c3-66f1-4c29-b7d1-cac5aa342370-kube-api-access-8xv7l\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763868 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-client\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763963 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546h9\" (UniqueName: \"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763984 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-audit-dir\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-images\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e811d29-20d6-4576-be0f-dc59cf11b497-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764064 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-service-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764127 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-audit\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764186 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qgn8\" (UniqueName: \"kubernetes.io/projected/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-kube-api-access-6qgn8\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-config\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764227 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764244 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-trusted-ca\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764275 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764288 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjzvv\" (UniqueName: 
\"kubernetes.io/projected/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-kube-api-access-qjzvv\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7n56\" (UniqueName: \"kubernetes.io/projected/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-kube-api-access-r7n56\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764336 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764381 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnzj5\" (UniqueName: \"kubernetes.io/projected/ed6485ab-c517-41cd-a755-d5dc9557456b-kube-api-access-rnzj5\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbjp\" (UniqueName: \"kubernetes.io/projected/4e811d29-20d6-4576-be0f-dc59cf11b497-kube-api-access-tbbjp\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764474 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-serving-cert\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764513 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-client\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764535 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764556 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-dir\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764596 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-auth-proxy-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764615 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-encryption-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-policies\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-serving-cert\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764704 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764724 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-serving-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-image-import-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764783 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e811d29-20d6-4576-be0f-dc59cf11b497-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764806 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764909 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czvcn\" (UniqueName: \"kubernetes.io/projected/3d1887a7-d8a5-45f6-97fc-c32a870089ef-kube-api-access-czvcn\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764929 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764948 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c8ll\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-kube-api-access-4c8ll\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.765971 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.766596 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.767152 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.767742 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769069 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769946 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.770075 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769082 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: 
I0219 21:30:27.769091 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769101 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769114 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.770686 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.770712 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.772528 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.773393 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.773711 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.774622 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.774773 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.774811 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 21:30:27 crc 
kubenswrapper[4795]: I0219 21:30:27.775251 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.778924 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.778968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r8dcx"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.779473 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.779894 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.780499 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.783920 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sd5jz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.784302 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.784662 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.785033 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.785426 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.785599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.787746 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.788138 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ggndt"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.788779 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.789773 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.790271 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.793265 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.793951 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.805119 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.805331 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.812688 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.812699 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.812846 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.813021 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.814359 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.814913 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.815441 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.815976 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816072 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816362 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816532 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816794 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816879 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816952 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.817323 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.817540 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.827937 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.828353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.828717 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qks97"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.828798 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.829740 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.844337 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bks74"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.844407 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5qppq"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.844506 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.844767 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.846454 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.846869 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.847982 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.849094 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7mzng"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.851443 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.851494 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.853096 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.855191 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.858475 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.859564 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.860516 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.894928 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895797 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-audit\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895831 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qgn8\" (UniqueName: \"kubernetes.io/projected/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-kube-api-access-6qgn8\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-config\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895867 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-trusted-ca\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjzvv\" (UniqueName: \"kubernetes.io/projected/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-kube-api-access-qjzvv\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895935 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wp452"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896517 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wp452"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7n56\" (UniqueName: \"kubernetes.io/projected/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-kube-api-access-r7n56\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnzj5\" (UniqueName: \"kubernetes.io/projected/ed6485ab-c517-41cd-a755-d5dc9557456b-kube-api-access-rnzj5\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbbjp\" (UniqueName: \"kubernetes.io/projected/4e811d29-20d6-4576-be0f-dc59cf11b497-kube-api-access-tbbjp\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896761 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-serving-cert\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896776 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-client\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896791 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-dir\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896853 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-auth-proxy-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896869 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-encryption-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-policies\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-serving-cert\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-serving-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-image-import-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896967 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e811d29-20d6-4576-be0f-dc59cf11b497-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897031 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897046 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897064 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czvcn\" (UniqueName: \"kubernetes.io/projected/3d1887a7-d8a5-45f6-97fc-c32a870089ef-kube-api-access-czvcn\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c8ll\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-kube-api-access-4c8ll\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897185 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-encryption-config\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897202 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-node-pullsecrets\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897235 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/958d08af-86cf-467f-947b-4485163fd695-machine-approver-tls\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897251 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4sp\" (UniqueName: \"kubernetes.io/projected/96e1b4b4-8d48-4955-b756-71d21a5aea0b-kube-api-access-4w4sp\") pod \"downloads-7954f5f757-rt6qz\" (UID: \"96e1b4b4-8d48-4955-b756-71d21a5aea0b\") " pod="openshift-console/downloads-7954f5f757-rt6qz"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-serving-cert\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897318 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"]
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f28c04c3-66f1-4c29-b7d1-cac5aa342370-serving-cert\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-config\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897409 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-config\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897477 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897495 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xv7l\" (UniqueName: \"kubernetes.io/projected/f28c04c3-66f1-4c29-b7d1-cac5aa342370-kube-api-access-8xv7l\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxdq\" (UniqueName: \"kubernetes.io/projected/958d08af-86cf-467f-947b-4485163fd695-kube-api-access-6cxdq\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897551 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-client\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-546h9\" (UniqueName: \"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-audit-dir\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897662 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l"
Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-images\") pod
\"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897709 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e811d29-20d6-4576-be0f-dc59cf11b497-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897724 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-service-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-trusted-ca\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898242 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895910 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-service-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896621 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898767 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898897 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.899121 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: 
\"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.899664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.900092 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.902043 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.903339 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-serving-cert\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.905762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/958d08af-86cf-467f-947b-4485163fd695-machine-approver-tls\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.906867 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.906964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-dir\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.907156 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f28c04c3-66f1-4c29-b7d1-cac5aa342370-serving-cert\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.907347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-client\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.908638 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.908633 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.909545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.910248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.910476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.910972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e811d29-20d6-4576-be0f-dc59cf11b497-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: 
\"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.911100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-auth-proxy-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.911238 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-serving-cert\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.911602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.911619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912256 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-config\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912702 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912852 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.913455 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.914708 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-config\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.915628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.904281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-audit\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.915709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.916011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-config\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: 
\"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.916469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-audit-dir\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.916513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.917040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.917083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.917617 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") pod 
\"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.917967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.918041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.918472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-policies\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.918907 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.919043 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: 
\"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.919253 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.919561 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.919663 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.920358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.920571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.921602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-serving-cert\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc 
kubenswrapper[4795]: I0219 21:30:27.921754 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.921826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-image-import-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.922243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.922298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-node-pullsecrets\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.922573 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-images\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.922718 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.923138 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-serving-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.923903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.924393 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-client\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.924967 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.926774 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.928469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.929117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e811d29-20d6-4576-be0f-dc59cf11b497-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-encryption-config\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930710 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-encryption-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930752 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sd5jz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.932027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.934493 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.935507 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hxxkd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.936515 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.937397 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.939395 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-c6n7h"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.940178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.943239 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.943271 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t48rm"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.944199 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.945426 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.946388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.947349 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.948331 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt6qz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.949348 4795 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.950368 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hxxkd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.951372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.952346 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7m8cj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.952888 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.953184 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.953390 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-st76p"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.953988 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.955870 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ggndt"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.957119 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.959002 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.959955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.962836 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.963944 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.965465 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.966678 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.968246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.969683 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-ingress-canary/ingress-canary-7m8cj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.970827 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.972660 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.981577 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.982710 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c6n7h"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.983941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.985008 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wp452"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.986124 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5qppq"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.987026 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.993380 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.013487 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 21:30:28 crc kubenswrapper[4795]: 
I0219 21:30:28.033091 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.052995 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.073210 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.092591 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.113353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.133086 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.153047 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.173980 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.194398 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.212954 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.233716 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.252930 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.273555 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.293287 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.314207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.333119 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.353717 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.374113 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.392977 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.413195 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.434072 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.453850 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.473562 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.493612 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.514261 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.533439 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.554330 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.574387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.593751 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.613972 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.633329 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.652775 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.673855 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.692919 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.714057 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.733480 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.753070 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.774035 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.794009 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.813110 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.832046 4795 request.go:700] Waited for 1.014814646s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-operator-dockercfg-2bh8d&limit=500&resourceVersion=0 Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.833653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.853731 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.873711 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.893046 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.913373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.953060 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.973900 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.993901 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.013917 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.053097 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.073426 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.093506 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.113456 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.133766 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.153417 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.173643 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.194458 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.213952 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.234450 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 21:30:29 crc 
kubenswrapper[4795]: I0219 21:30:29.254376 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.274067 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.293972 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.314023 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.333387 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.353262 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.391833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.410869 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.431724 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7n56\" (UniqueName: \"kubernetes.io/projected/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-kube-api-access-r7n56\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.433380 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.455964 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.472123 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjzvv\" (UniqueName: \"kubernetes.io/projected/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-kube-api-access-qjzvv\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.488016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbbjp\" (UniqueName: \"kubernetes.io/projected/4e811d29-20d6-4576-be0f-dc59cf11b497-kube-api-access-tbbjp\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.493921 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.497371 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.505572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.514606 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.542621 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.555907 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.573338 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.596466 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qgn8\" (UniqueName: \"kubernetes.io/projected/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-kube-api-access-6qgn8\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.610824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xv7l\" (UniqueName: \"kubernetes.io/projected/f28c04c3-66f1-4c29-b7d1-cac5aa342370-kube-api-access-8xv7l\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 
21:30:29.629362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.649463 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.650323 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4sp\" (UniqueName: \"kubernetes.io/projected/96e1b4b4-8d48-4955-b756-71d21a5aea0b-kube-api-access-4w4sp\") pod \"downloads-7954f5f757-rt6qz\" (UID: \"96e1b4b4-8d48-4955-b756-71d21a5aea0b\") " pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.652922 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.670801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxdq\" (UniqueName: \"kubernetes.io/projected/958d08af-86cf-467f-947b-4485163fd695-kube-api-access-6cxdq\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.683961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7mzng"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.689085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnzj5\" (UniqueName: 
\"kubernetes.io/projected/ed6485ab-c517-41cd-a755-d5dc9557456b-kube-api-access-rnzj5\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:29 crc kubenswrapper[4795]: W0219 21:30:29.690865 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab78fbf6_65df_4306_a7b8_c7bd98cfdf49.slice/crio-e08cbad049afd6e4df92a67723860d8461febe9f1e199e1df3fb5da0e93d79e9 WatchSource:0}: Error finding container e08cbad049afd6e4df92a67723860d8461febe9f1e199e1df3fb5da0e93d79e9: Status 404 returned error can't find the container with id e08cbad049afd6e4df92a67723860d8461febe9f1e199e1df3fb5da0e93d79e9 Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.727520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czvcn\" (UniqueName: \"kubernetes.io/projected/3d1887a7-d8a5-45f6-97fc-c32a870089ef-kube-api-access-czvcn\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.749761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-546h9\" (UniqueName: \"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.752266 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " 
pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.768520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c8ll\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-kube-api-access-4c8ll\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.774789 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.777396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.785550 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.790832 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.791103 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.793238 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.813190 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.814331 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.826884 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.828920 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qks97"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.832556 4795 request.go:700] Waited for 1.892220907s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.834129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 21:30:29 crc kubenswrapper[4795]: W0219 21:30:29.839578 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e811d29_20d6_4576_be0f_dc59cf11b497.slice/crio-a225b9dbb5829dc9a010e0cea574920e124adc579b745a974853939c6761b15b WatchSource:0}: Error finding container a225b9dbb5829dc9a010e0cea574920e124adc579b745a974853939c6761b15b: Status 404 returned error can't find the container with id a225b9dbb5829dc9a010e0cea574920e124adc579b745a974853939c6761b15b Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.853795 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.874622 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.893896 4795 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.907435 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.907789 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.913896 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.929257 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.933952 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.955584 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.973957 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.975635 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.993966 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.013152 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.066534 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.106866 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" event={"ID":"958d08af-86cf-467f-947b-4485163fd695","Type":"ContainerStarted","Data":"6ef3a2b7973f10cbab1a50172dcfa9231c814a27302bbe3f3d2d2766a6779de2"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.111962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" event={"ID":"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49","Type":"ContainerStarted","Data":"5e9a49f9ee76f8861850372260d1dc3a6cb6e8528a5cfb294577c6562c884c7e"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.111988 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" event={"ID":"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49","Type":"ContainerStarted","Data":"e399854ac6989202c88f9416e637fc539ba3651ba6b6d92efc53ec026b8cbe64"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.111998 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" event={"ID":"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49","Type":"ContainerStarted","Data":"e08cbad049afd6e4df92a67723860d8461febe9f1e199e1df3fb5da0e93d79e9"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.117863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" event={"ID":"9de314c5-1440-476b-b98b-7804f5d95145","Type":"ContainerStarted","Data":"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.117923 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.117937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" event={"ID":"9de314c5-1440-476b-b98b-7804f5d95145","Type":"ContainerStarted","Data":"704e722c476db821d8e6f00d8c80db7e6888aef51f3367193fd7b4f2cac02bc3"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.121006 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" event={"ID":"e736a4ca-7c18-423c-a5de-aeafd8d6a42e","Type":"ContainerStarted","Data":"a47acb3ecc1d10e216fd9499681158ffe4290a98dd0b6be6c113f96e5a89d287"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.123931 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/96b2274a-7361-4463-8562-1319e967066b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 
21:30:30.123966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f61e27e-ff25-4d76-814b-ed72e576547c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-serving-cert\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vppps\" (UniqueName: \"kubernetes.io/projected/a3a4a159-1ab3-412f-ac71-11a7a41012ea-kube-api-access-vppps\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz87x\" (UniqueName: \"kubernetes.io/projected/821fa263-1235-4a62-8818-1f41d7e77a62-kube-api-access-pz87x\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7rd\" (UniqueName: 
\"kubernetes.io/projected/4491a495-df05-43ef-bff7-2438317eac71-kube-api-access-ts7rd\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgtbb\" (UniqueName: \"kubernetes.io/projected/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-kube-api-access-hgtbb\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124149 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-client\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124178 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124196 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-srv-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124255 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4491a495-df05-43ef-bff7-2438317eac71-metrics-tls\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124301 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89594551-78e6-49af-9376-477cf01d2dc5-config\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/821fa263-1235-4a62-8818-1f41d7e77a62-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-config\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a4a159-1ab3-412f-ac71-11a7a41012ea-service-ca-bundle\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" 
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9be22b71-1386-4be4-a38a-6f3b97669b9c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89594551-78e6-49af-9376-477cf01d2dc5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-proxy-tls\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124475 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0da0af7f-f8f8-492d-bd44-1e81ab242a24-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124507 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74cgb\" (UniqueName: \"kubernetes.io/projected/116348e7-5632-4244-8ae4-f81b06c6df4d-kube-api-access-74cgb\") pod \"migrator-59844c95c7-6b8wp\" (UID: \"116348e7-5632-4244-8ae4-f81b06c6df4d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124523 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxj5\" (UniqueName: \"kubernetes.io/projected/49a1bc45-62f5-45d2-a475-b2b562cd9b98-kube-api-access-5hxj5\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4tq7\" (UniqueName: \"kubernetes.io/projected/73ed383c-c1e3-4f47-86d3-6faa77121e28-kube-api-access-q4tq7\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9be22b71-1386-4be4-a38a-6f3b97669b9c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-images\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjl9d\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-kube-api-access-pjl9d\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124710 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ed383c-c1e3-4f47-86d3-6faa77121e28-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2hd\" (UniqueName: \"kubernetes.io/projected/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-kube-api-access-lt2hd\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-metrics-certs\") pod \"router-default-5444994796-r8dcx\" (UID: 
\"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f61e27e-ff25-4d76-814b-ed72e576547c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/275a566e-d165-434a-95c1-9154ede6e14e-trusted-ca\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124795 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f61e27e-ff25-4d76-814b-ed72e576547c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124836 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124895 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89594551-78e6-49af-9376-477cf01d2dc5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124960 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrjbp\" (UniqueName: \"kubernetes.io/projected/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-kube-api-access-nrjbp\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8fb\" (UniqueName: \"kubernetes.io/projected/96b2274a-7361-4463-8562-1319e967066b-kube-api-access-6k8fb\") pod 
\"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-service-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swsnw\" (UniqueName: \"kubernetes.io/projected/b7390208-03b5-4ffa-9259-f5f1d9354c52-kube-api-access-swsnw\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125087 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125101 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.125116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-default-certificate\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125178 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-profile-collector-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-stats-auth\") pod 
\"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-proxy-tls\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/821fa263-1235-4a62-8818-1f41d7e77a62-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125268 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125283 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73ed383c-c1e3-4f47-86d3-6faa77121e28-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.125313 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh8cm\" (UniqueName: \"kubernetes.io/projected/0da0af7f-f8f8-492d-bd44-1e81ab242a24-kube-api-access-hh8cm\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/275a566e-d165-434a-95c1-9154ede6e14e-metrics-tls\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be22b71-1386-4be4-a38a-6f3b97669b9c-config\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125370 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-serving-cert\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"
Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.127839 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.627827569 +0000 UTC m=+141.820345433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.129570 4795 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pb7s7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body=
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.129618 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.132289 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" event={"ID":"85a5ad8f-8c61-4e28-8e23-9a51e7796e37","Type":"ContainerStarted","Data":"1e9c5c5010f2025652496676ae8ef5a35367e452fc08339397f613200f034bf7"}
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.132367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" event={"ID":"85a5ad8f-8c61-4e28-8e23-9a51e7796e37","Type":"ContainerStarted","Data":"009cee7f31631bbb3b8adf58cf6d07a80b4ad1881ea2472cb834a312f36c3b3b"}
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.135076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" event={"ID":"4e811d29-20d6-4576-be0f-dc59cf11b497","Type":"ContainerStarted","Data":"f98138c9981b8d5841eb213d8855b1938ee68b7d61ad23adca721db8dc42dd50"}
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.135150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" event={"ID":"4e811d29-20d6-4576-be0f-dc59cf11b497","Type":"ContainerStarted","Data":"a225b9dbb5829dc9a010e0cea574920e124adc579b745a974853939c6761b15b"}
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226087 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz87x\" (UniqueName: \"kubernetes.io/projected/821fa263-1235-4a62-8818-1f41d7e77a62-kube-api-access-pz87x\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7rd\" (UniqueName: \"kubernetes.io/projected/4491a495-df05-43ef-bff7-2438317eac71-kube-api-access-ts7rd\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgtbb\" (UniqueName: \"kubernetes.io/projected/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-kube-api-access-hgtbb\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226494 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-client\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226550 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/adcffec1-66db-4e9b-9502-be3c9b008dde-metrics-tls\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-258mh\" (UniqueName: \"kubernetes.io/projected/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-kube-api-access-258mh\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-srv-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226671 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht9nk\" (UniqueName: \"kubernetes.io/projected/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-kube-api-access-ht9nk\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226697 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-certs\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4491a495-df05-43ef-bff7-2438317eac71-metrics-tls\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226809 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89594551-78e6-49af-9376-477cf01d2dc5-config\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226858 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-mountpoint-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226898 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-csi-data-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226920 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htpcf\" (UniqueName: \"kubernetes.io/projected/a82f8242-05f0-48f7-8f86-cf472309b8e3-kube-api-access-htpcf\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226963 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/821fa263-1235-4a62-8818-1f41d7e77a62-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-config\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227055 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-key\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a4a159-1ab3-412f-ac71-11a7a41012ea-service-ca-bundle\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9be22b71-1386-4be4-a38a-6f3b97669b9c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227138 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-proxy-tls\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0da0af7f-f8f8-492d-bd44-1e81ab242a24-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227209 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89594551-78e6-49af-9376-477cf01d2dc5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxj5\" (UniqueName: \"kubernetes.io/projected/49a1bc45-62f5-45d2-a475-b2b562cd9b98-kube-api-access-5hxj5\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tq7\" (UniqueName: \"kubernetes.io/projected/73ed383c-c1e3-4f47-86d3-6faa77121e28-kube-api-access-q4tq7\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74cgb\" (UniqueName: \"kubernetes.io/projected/116348e7-5632-4244-8ae4-f81b06c6df4d-kube-api-access-74cgb\") pod \"migrator-59844c95c7-6b8wp\" (UID: \"116348e7-5632-4244-8ae4-f81b06c6df4d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9be22b71-1386-4be4-a38a-6f3b97669b9c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-webhook-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-images\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675sf\" (UniqueName: \"kubernetes.io/projected/c66857b5-461d-4c11-8fa3-52cd619bba60-kube-api-access-675sf\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227491 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-srv-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjl9d\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-kube-api-access-pjl9d\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ed383c-c1e3-4f47-86d3-6faa77121e28-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2hd\" (UniqueName: \"kubernetes.io/projected/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-kube-api-access-lt2hd\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-metrics-certs\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227643 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f61e27e-ff25-4d76-814b-ed72e576547c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227693 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/275a566e-d165-434a-95c1-9154ede6e14e-trusted-ca\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227727 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f61e27e-ff25-4d76-814b-ed72e576547c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227794 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-apiservice-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"
Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.227842 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.727822768 +0000 UTC m=+141.920340632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227745 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgb98\" (UniqueName: \"kubernetes.io/projected/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-kube-api-access-zgb98\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89594551-78e6-49af-9376-477cf01d2dc5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l8c6\" (UniqueName: \"kubernetes.io/projected/5667f763-a535-49eb-90a2-b78f1ebad0b7-kube-api-access-7l8c6\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228550 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228567 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrjbp\" (UniqueName: \"kubernetes.io/projected/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-kube-api-access-nrjbp\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8fb\" (UniqueName: \"kubernetes.io/projected/96b2274a-7361-4463-8562-1319e967066b-kube-api-access-6k8fb\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228642 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228662 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thnz2\" (UniqueName: \"kubernetes.io/projected/adcffec1-66db-4e9b-9502-be3c9b008dde-kube-api-access-thnz2\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228680 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82f8242-05f0-48f7-8f86-cf472309b8e3-config\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228718 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-service-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swsnw\" (UniqueName: \"kubernetes.io/projected/b7390208-03b5-4ffa-9259-f5f1d9354c52-kube-api-access-swsnw\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.229248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ed383c-c1e3-4f47-86d3-6faa77121e28-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.233084 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.235972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9be22b71-1386-4be4-a38a-6f3b97669b9c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.236058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-client\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.237474 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.237675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89594551-78e6-49af-9376-477cf01d2dc5-config\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.243203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-srv-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.243482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-images\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.244277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0da0af7f-f8f8-492d-bd44-1e81ab242a24-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.248120 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"]
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.252291 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"]
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.254660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f61e27e-ff25-4d76-814b-ed72e576547c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.255534 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/275a566e-d165-434a-95c1-9154ede6e14e-trusted-ca\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.255542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4491a495-df05-43ef-bff7-2438317eac71-metrics-tls\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.256288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-default-certificate\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257346 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/821fa263-1235-4a62-8818-1f41d7e77a62-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.256442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a4a159-1ab3-412f-ac71-11a7a41012ea-service-ca-bundle\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257660 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-plugins-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257770 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-node-bootstrap-token\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-profile-collector-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-stats-auth\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258507 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-registration-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258720 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/adcffec1-66db-4e9b-9502-be3c9b008dde-config-volume\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258806 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/821fa263-1235-4a62-8818-1f41d7e77a62-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-proxy-tls\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-socket-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258955 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258965 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-cabundle\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.260287 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-proxy-tls\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.260694 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/821fa263-1235-4a62-8818-1f41d7e77a62-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261082 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-cert\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") pod \"image-registry-697d97f7c8-4h49m\" (UID: 
\"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261445 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73ed383c-c1e3-4f47-86d3-6faa77121e28-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261475 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh8cm\" (UniqueName: \"kubernetes.io/projected/0da0af7f-f8f8-492d-bd44-1e81ab242a24-kube-api-access-hh8cm\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:30 crc kubenswrapper[4795]: 
I0219 21:30:30.261713 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66857b5-461d-4c11-8fa3-52cd619bba60-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261756 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/275a566e-d165-434a-95c1-9154ede6e14e-metrics-tls\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be22b71-1386-4be4-a38a-6f3b97669b9c-config\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-serving-cert\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261825 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: 
\"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261843 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvhgm\" (UniqueName: \"kubernetes.io/projected/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-kube-api-access-zvhgm\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.262701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-profile-collector-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.263450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-stats-auth\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.263456 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f61e27e-ff25-4d76-814b-ed72e576547c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.263481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/96b2274a-7361-4463-8562-1319e967066b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.263519 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xckwg\" (UniqueName: \"kubernetes.io/projected/b11366da-6972-44ce-8e8c-151de77fa689-kube-api-access-xckwg\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.264914 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82f8242-05f0-48f7-8f86-cf472309b8e3-serving-cert\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.264972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-serving-cert\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.264998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b11366da-6972-44ce-8e8c-151de77fa689-tmpfs\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.265073 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vppps\" (UniqueName: \"kubernetes.io/projected/a3a4a159-1ab3-412f-ac71-11a7a41012ea-kube-api-access-vppps\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266453 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be22b71-1386-4be4-a38a-6f3b97669b9c-config\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-service-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266621 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t48rm"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266966 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.267424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.267675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89594551-78e6-49af-9376-477cf01d2dc5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.269247 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/96b2274a-7361-4463-8562-1319e967066b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.269469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-config\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.270066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f61e27e-ff25-4d76-814b-ed72e576547c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.270997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73ed383c-c1e3-4f47-86d3-6faa77121e28-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.271648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-default-certificate\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.274379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxj5\" (UniqueName: \"kubernetes.io/projected/49a1bc45-62f5-45d2-a475-b2b562cd9b98-kube-api-access-5hxj5\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.274771 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-proxy-tls\") pod 
\"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.275121 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-serving-cert\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.275419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-metrics-certs\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.278224 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-serving-cert\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.286324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/275a566e-d165-434a-95c1-9154ede6e14e-metrics-tls\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.291888 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4tq7\" (UniqueName: 
\"kubernetes.io/projected/73ed383c-c1e3-4f47-86d3-6faa77121e28-kube-api-access-q4tq7\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.328277 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.331066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74cgb\" (UniqueName: \"kubernetes.io/projected/116348e7-5632-4244-8ae4-f81b06c6df4d-kube-api-access-74cgb\") pod \"migrator-59844c95c7-6b8wp\" (UID: \"116348e7-5632-4244-8ae4-f81b06c6df4d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.332467 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz87x\" (UniqueName: \"kubernetes.io/projected/821fa263-1235-4a62-8818-1f41d7e77a62-kube-api-access-pz87x\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.336324 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bks74"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.346817 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt6qz"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.347803 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.349808 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.351480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7rd\" (UniqueName: \"kubernetes.io/projected/4491a495-df05-43ef-bff7-2438317eac71-kube-api-access-ts7rd\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82f8242-05f0-48f7-8f86-cf472309b8e3-serving-cert\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b11366da-6972-44ce-8e8c-151de77fa689-tmpfs\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/adcffec1-66db-4e9b-9502-be3c9b008dde-metrics-tls\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht9nk\" (UniqueName: 
\"kubernetes.io/projected/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-kube-api-access-ht9nk\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365893 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-258mh\" (UniqueName: \"kubernetes.io/projected/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-kube-api-access-258mh\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-certs\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-mountpoint-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366424 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-csi-data-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htpcf\" (UniqueName: 
\"kubernetes.io/projected/a82f8242-05f0-48f7-8f86-cf472309b8e3-kube-api-access-htpcf\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366459 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-key\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-webhook-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675sf\" (UniqueName: \"kubernetes.io/projected/c66857b5-461d-4c11-8fa3-52cd619bba60-kube-api-access-675sf\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-srv-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366600 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-csi-data-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366620 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-mountpoint-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366657 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-apiservice-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.366731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366771 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgb98\" (UniqueName: \"kubernetes.io/projected/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-kube-api-access-zgb98\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366795 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l8c6\" (UniqueName: \"kubernetes.io/projected/5667f763-a535-49eb-90a2-b78f1ebad0b7-kube-api-access-7l8c6\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: 
\"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366920 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thnz2\" (UniqueName: \"kubernetes.io/projected/adcffec1-66db-4e9b-9502-be3c9b008dde-kube-api-access-thnz2\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.366942 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.866926788 +0000 UTC m=+142.059444742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82f8242-05f0-48f7-8f86-cf472309b8e3-config\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-plugins-dir\") pod 
\"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-node-bootstrap-token\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367143 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-socket-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-registration-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/adcffec1-66db-4e9b-9502-be3c9b008dde-config-volume\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-cabundle\") pod \"service-ca-9c57cc56f-wp452\" 
(UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-cert\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66857b5-461d-4c11-8fa3-52cd619bba60-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367400 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhgm\" (UniqueName: \"kubernetes.io/projected/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-kube-api-access-zvhgm\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xckwg\" (UniqueName: \"kubernetes.io/projected/b11366da-6972-44ce-8e8c-151de77fa689-kube-api-access-xckwg\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.368275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-socket-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.368355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-registration-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.368796 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b11366da-6972-44ce-8e8c-151de77fa689-tmpfs\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.369093 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.369329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/adcffec1-66db-4e9b-9502-be3c9b008dde-config-volume\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.370147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82f8242-05f0-48f7-8f86-cf472309b8e3-serving-cert\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.370225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-cabundle\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.370586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-certs\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.370668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-plugins-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.371137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82f8242-05f0-48f7-8f86-cf472309b8e3-config\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.371461 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.371732 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgtbb\" (UniqueName: \"kubernetes.io/projected/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-kube-api-access-hgtbb\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.371771 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.374661 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/adcffec1-66db-4e9b-9502-be3c9b008dde-metrics-tls\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.377113 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-apiservice-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.377147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.378081 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-srv-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.380276 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66857b5-461d-4c11-8fa3-52cd619bba60-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.380768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.380773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-node-bootstrap-token\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.380989 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-key\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.382946 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-webhook-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.382961 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-cert\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.390279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89594551-78e6-49af-9376-477cf01d2dc5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.407529 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.422240 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.426488 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.443340 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.448138 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8fb\" (UniqueName: \"kubernetes.io/projected/96b2274a-7361-4463-8562-1319e967066b-kube-api-access-6k8fb\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.468701 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.468971 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.968939761 +0000 UTC m=+142.161457625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.469085 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.469550 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.969532866 +0000 UTC m=+142.162050740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.470845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrjbp\" (UniqueName: \"kubernetes.io/projected/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-kube-api-access-nrjbp\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.477737 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.486493 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.487368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjl9d\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-kube-api-access-pjl9d\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.498349 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.511053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2hd\" (UniqueName: \"kubernetes.io/projected/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-kube-api-access-lt2hd\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.512744 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.524387 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.526963 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.531774 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.548122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9be22b71-1386-4be4-a38a-6f3b97669b9c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.570040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swsnw\" (UniqueName: \"kubernetes.io/projected/b7390208-03b5-4ffa-9259-f5f1d9354c52-kube-api-access-swsnw\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.571291 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.571453 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.071441086 +0000 UTC m=+142.263958950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.571642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.572192 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.072184516 +0000 UTC m=+142.264702380 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.621170 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh8cm\" (UniqueName: \"kubernetes.io/projected/0da0af7f-f8f8-492d-bd44-1e81ab242a24-kube-api-access-hh8cm\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.641506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f61e27e-ff25-4d76-814b-ed72e576547c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.650857 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vppps\" (UniqueName: \"kubernetes.io/projected/a3a4a159-1ab3-412f-ac71-11a7a41012ea-kube-api-access-vppps\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.673002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.673502 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.17348424 +0000 UTC m=+142.366002114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.693783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-258mh\" (UniqueName: \"kubernetes.io/projected/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-kube-api-access-258mh\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.696897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht9nk\" (UniqueName: \"kubernetes.io/projected/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-kube-api-access-ht9nk\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.713407 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.729429 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.729920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thnz2\" (UniqueName: \"kubernetes.io/projected/adcffec1-66db-4e9b-9502-be3c9b008dde-kube-api-access-thnz2\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.742253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.750609 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.756476 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r8dcx"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.756688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgb98\" (UniqueName: \"kubernetes.io/projected/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-kube-api-access-zgb98\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.762766 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.772562 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.774987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.777274 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xckwg\" (UniqueName: \"kubernetes.io/projected/b11366da-6972-44ce-8e8c-151de77fa689-kube-api-access-xckwg\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"
Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.780336 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.280315108 +0000 UTC m=+142.472832972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.790755 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.791648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.804963 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.810055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htpcf\" (UniqueName: \"kubernetes.io/projected/a82f8242-05f0-48f7-8f86-cf472309b8e3-kube-api-access-htpcf\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.819159 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf"]
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.832248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l8c6\" (UniqueName: \"kubernetes.io/projected/5667f763-a535-49eb-90a2-b78f1ebad0b7-kube-api-access-7l8c6\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.847060 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.854301 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.860534 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.863133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-675sf\" (UniqueName: \"kubernetes.io/projected/c66857b5-461d-4c11-8fa3-52cd619bba60-kube-api-access-675sf\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.870715 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wp452"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.875903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.876234 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.376219731 +0000 UTC m=+142.568737595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.876334 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.895836 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhgm\" (UniqueName: \"kubernetes.io/projected/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-kube-api-access-zvhgm\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.897444 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.904087 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c6n7h"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.910994 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7m8cj"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.921008 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-st76p"
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.973221 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h"]
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.978419 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.978833 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.478820159 +0000 UTC m=+142.671338023 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:31 crc kubenswrapper[4795]: W0219 21:30:31.058663 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod821fa263_1235_4a62_8818_1f41d7e77a62.slice/crio-56286eeaa421704e013da8463ab6cc556f56e9d2aed38e4b3489266c84c1b9ea WatchSource:0}: Error finding container 56286eeaa421704e013da8463ab6cc556f56e9d2aed38e4b3489266c84c1b9ea: Status 404 returned error can't find the container with id 56286eeaa421704e013da8463ab6cc556f56e9d2aed38e4b3489266c84c1b9ea
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.079375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.080071 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.580046861 +0000 UTC m=+142.772564725 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.080362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.081012 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.580997426 +0000 UTC m=+142.773515290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.134824 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.140338 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.152023 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp"]
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.160052 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"]
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.161319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" event={"ID":"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52","Type":"ContainerStarted","Data":"b9ca794a7257f326161a0498af5987554c0c1c2f210b07e97380bc7713ff16a4"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.161394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" event={"ID":"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52","Type":"ContainerStarted","Data":"fd805e4fcb58a725d5225bfaf1940704f4aa9a8e3cd2c3338fff969733e167b2"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.166176 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ggndt"]
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.173854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" event={"ID":"102f7fb5-3031-4853-b112-2aa910aa63a7","Type":"ContainerStarted","Data":"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.173896 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" event={"ID":"102f7fb5-3031-4853-b112-2aa910aa63a7","Type":"ContainerStarted","Data":"f4d807af544e927e81a81905631510e6f7454a6d612cc5078b9fdd6b9b356c32"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.174622 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.176402 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r8dcx" event={"ID":"a3a4a159-1ab3-412f-ac71-11a7a41012ea","Type":"ContainerStarted","Data":"71a2c0fae3102f165644cba6fd7af15af48a7844131e7534c4403e9ec4772574"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.176537 4795 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-vtqjw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.176579 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.177788 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"]
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.178283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" event={"ID":"3d1887a7-d8a5-45f6-97fc-c32a870089ef","Type":"ContainerDied","Data":"f0cde8ac048a3cf993681070ef0d56921f0043611917efe89b9b8e0a5e28375d"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.178133 4795 generic.go:334] "Generic (PLEG): container finished" podID="3d1887a7-d8a5-45f6-97fc-c32a870089ef" containerID="f0cde8ac048a3cf993681070ef0d56921f0043611917efe89b9b8e0a5e28375d" exitCode=0
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.178461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" event={"ID":"3d1887a7-d8a5-45f6-97fc-c32a870089ef","Type":"ContainerStarted","Data":"c722c0dd0327b20acc81d77a82fc7b1282c9c0cf149cc086e65963352858e300"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.181659 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.182769 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.682751241 +0000 UTC m=+142.875269105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.183470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt6qz" event={"ID":"96e1b4b4-8d48-4955-b756-71d21a5aea0b","Type":"ContainerStarted","Data":"3dd0685b4530dd1fe342bc28f461e5de1e5f379ed3a082e2a45f0460350baa66"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.183504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt6qz" event={"ID":"96e1b4b4-8d48-4955-b756-71d21a5aea0b","Type":"ContainerStarted","Data":"83a817e2970ad2a1632d8b71f15be18e56f12022d5a772355cc6f4cad1d10c52"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.183841 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-rt6qz"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.186377 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" event={"ID":"e736a4ca-7c18-423c-a5de-aeafd8d6a42e","Type":"ContainerStarted","Data":"6e0e3ff5de17f106edb0c69e22256075514d4cef13c40131bda5b1e6dc246118"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.186422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" event={"ID":"e736a4ca-7c18-423c-a5de-aeafd8d6a42e","Type":"ContainerStarted","Data":"9df260857e4677af91be5709179f5553fea66fe8e1c2ee47b052bc4921fcb804"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.187333 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt6qz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.187374 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt6qz" podUID="96e1b4b4-8d48-4955-b756-71d21a5aea0b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.188893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bks74" event={"ID":"f28c04c3-66f1-4c29-b7d1-cac5aa342370","Type":"ContainerStarted","Data":"ac07608a22e647356a2a68507010c9abd136e30894df52889b17fff4c24d2ba3"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.188932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bks74" event={"ID":"f28c04c3-66f1-4c29-b7d1-cac5aa342370","Type":"ContainerStarted","Data":"40bb6474669f29cc42e7cb33423eacd5d40546cce81b359830e2220fb1a5d2aa"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.189740 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bks74"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.191120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" event={"ID":"821fa263-1235-4a62-8818-1f41d7e77a62","Type":"ContainerStarted","Data":"56286eeaa421704e013da8463ab6cc556f56e9d2aed38e4b3489266c84c1b9ea"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.191344 4795 patch_prober.go:28] interesting pod/console-operator-58897d9998-bks74 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.191389 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bks74" podUID="f28c04c3-66f1-4c29-b7d1-cac5aa342370" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.193171 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" event={"ID":"958d08af-86cf-467f-947b-4485163fd695","Type":"ContainerStarted","Data":"2804453379dfda5aae2485de4530be2b31c99e85c377d29abc8a6728e9210670"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.193217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" event={"ID":"958d08af-86cf-467f-947b-4485163fd695","Type":"ContainerStarted","Data":"325a439f8d758572082bb2ce43737b9bc7dabbb98d24390b4525636cbb073668"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.203502 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" event={"ID":"86ffb50f-47f6-47b2-9141-1de9999a13e0","Type":"ContainerStarted","Data":"71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.203551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" event={"ID":"86ffb50f-47f6-47b2-9141-1de9999a13e0","Type":"ContainerStarted","Data":"f50c3553621e34238711ac41e2e592ef162e4af963002aedb152ce56da5992e5"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.204582 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.209444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvkhj" event={"ID":"ec60d287-0f21-467c-8030-84b8726af567","Type":"ContainerStarted","Data":"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.209473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvkhj" event={"ID":"ec60d287-0f21-467c-8030-84b8726af567","Type":"ContainerStarted","Data":"01605015262ec0d283a1299b16fa7df4e9785d87441123f54856fb5a6f2abf61"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.210849 4795 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7ph8l container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.210914 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.221910 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" event={"ID":"b9f6ef9a-86de-4d04-ba12-30197f5c83ed","Type":"ContainerStarted","Data":"601826ee4b11b39daa395ad3da07b7f46f6892d67fffb97a81226278879a919c"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.224211 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed6485ab-c517-41cd-a755-d5dc9557456b" containerID="aba703f930910bb23f6018706ca395beec254504d78904c6db0462f6f7a5b4eb" exitCode=0
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.224305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" event={"ID":"ed6485ab-c517-41cd-a755-d5dc9557456b","Type":"ContainerDied","Data":"aba703f930910bb23f6018706ca395beec254504d78904c6db0462f6f7a5b4eb"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.224322 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" event={"ID":"ed6485ab-c517-41cd-a755-d5dc9557456b","Type":"ContainerStarted","Data":"796f3e65b9756bbc57b6b03fbd8211042c936c8c64a0f444a7aca8b40a2f60fd"}
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.244559 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7"
Feb 19 21:30:31 crc kubenswrapper[4795]: W0219 21:30:31.270574 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod116348e7_5632_4244_8ae4_f81b06c6df4d.slice/crio-bd9be097580ab759fac27216a30d387fcf80c02c8acc7458d5fa0d5543a8dd06 WatchSource:0}: Error finding container bd9be097580ab759fac27216a30d387fcf80c02c8acc7458d5fa0d5543a8dd06: Status 404 returned error can't find the container with id bd9be097580ab759fac27216a30d387fcf80c02c8acc7458d5fa0d5543a8dd06
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.285356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.292234 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.792217659 +0000 UTC m=+142.984735523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.317897 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"]
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.325489 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"]
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.338636 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c"]
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.387356 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.387635 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.887617708 +0000 UTC m=+143.080135572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.388087 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m"
Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.388659 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.888415809 +0000 UTC m=+143.080933673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.451652 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"]
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.454840 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"]
Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.490449 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.491044 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.991022637 +0000 UTC m=+143.183540501 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.535871 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" podStartSLOduration=118.535852467 podStartE2EDuration="1m58.535852467s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:31.531674468 +0000 UTC m=+142.724192332" watchObservedRunningTime="2026-02-19 21:30:31.535852467 +0000 UTC m=+142.728370331" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.572179 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" podStartSLOduration=118.572160735 podStartE2EDuration="1m58.572160735s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:31.571999711 +0000 UTC m=+142.764517575" watchObservedRunningTime="2026-02-19 21:30:31.572160735 +0000 UTC m=+142.764678599" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.591804 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: 
\"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.592188 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.092156997 +0000 UTC m=+143.284674861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.625765 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.625809 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sd5jz"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.625819 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.692267 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.692573 4795 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.192553937 +0000 UTC m=+143.385071811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.793076 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.793672 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.293660096 +0000 UTC m=+143.486177960 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.894740 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.895185 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.395160185 +0000 UTC m=+143.587678049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.998340 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.998698 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.498683107 +0000 UTC m=+143.691200971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.070217 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.091673 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.092481 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" podStartSLOduration=119.092469015 podStartE2EDuration="1m59.092469015s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.07580698 +0000 UTC m=+143.268324844" watchObservedRunningTime="2026-02-19 21:30:32.092469015 +0000 UTC m=+143.284986879" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.100136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.100678 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.600662639 +0000 UTC m=+143.793180503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.203605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.203909 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.703896373 +0000 UTC m=+143.896414237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.206312 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wp452"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.218547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"] Feb 19 21:30:32 crc kubenswrapper[4795]: W0219 21:30:32.251100 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5667f763_a535_49eb_90a2_b78f1ebad0b7.slice/crio-d657e6145213a444981d32c25878e6c8aebbf41ec7496db4ae641bcf17c4600e WatchSource:0}: Error finding container d657e6145213a444981d32c25878e6c8aebbf41ec7496db4ae641bcf17c4600e: Status 404 returned error can't find the container with id d657e6145213a444981d32c25878e6c8aebbf41ec7496db4ae641bcf17c4600e Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.252387 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.253263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" event={"ID":"73ed383c-c1e3-4f47-86d3-6faa77121e28","Type":"ContainerStarted","Data":"681975eecd2824ee2b18364c0676ca10317e27cadf641e4850de46e5b1bd2a25"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.254654 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" event={"ID":"1f61e27e-ff25-4d76-814b-ed72e576547c","Type":"ContainerStarted","Data":"3d3e2d6d25f5674899e7d1b668557321521d58f757d7282a588fd063dac3e037"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.256379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" event={"ID":"116348e7-5632-4244-8ae4-f81b06c6df4d","Type":"ContainerStarted","Data":"bd9be097580ab759fac27216a30d387fcf80c02c8acc7458d5fa0d5543a8dd06"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.266223 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.268624 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" event={"ID":"9be22b71-1386-4be4-a38a-6f3b97669b9c","Type":"ContainerStarted","Data":"e247c0bda061b1d394a25f5efe898c368927a8ca75f985d4742ad2e630ac465f"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.270848 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" event={"ID":"b11366da-6972-44ce-8e8c-151de77fa689","Type":"ContainerStarted","Data":"53abc20ac9ab5c5e9c00220bd7928ad8a28c3b011f55813f000e904fdef09783"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.271855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" event={"ID":"49a1bc45-62f5-45d2-a475-b2b562cd9b98","Type":"ContainerStarted","Data":"e9139851c2bef4af0ae2f2c689b117f01c8077e3e83da00d8dbbcfc4f6566b25"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.280035 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" event={"ID":"89594551-78e6-49af-9376-477cf01d2dc5","Type":"ContainerStarted","Data":"a5f1bf1469e5995d24e08040279d8cd83af4d6df342a409a32ab2dc9474cb5e7"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.281004 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" event={"ID":"b7390208-03b5-4ffa-9259-f5f1d9354c52","Type":"ContainerStarted","Data":"12f4a241cedf4df064621df47c24f7aad9281d889676cd2bd4da009eecfcd016"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.282475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" event={"ID":"275a566e-d165-434a-95c1-9154ede6e14e","Type":"ContainerStarted","Data":"483c845d4c052aa50a28451932712fe5b0a864fbc1b6a919d8f6f91b5590fef9"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.291927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-st76p" event={"ID":"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8","Type":"ContainerStarted","Data":"f6c3de970402f5f2a706b9a0f207f80e122bf7818363fc8c9eaed089aeab076c"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.296952 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7m8cj"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.300855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" event={"ID":"96b2274a-7361-4463-8562-1319e967066b","Type":"ContainerStarted","Data":"a32167185a23f95c01f66f7d9775147e491c7807b907029f8e4f552e46ba8539"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.304342 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.304795 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.804773106 +0000 UTC m=+143.997290970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.306246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5qppq"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.308454 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.315800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c6n7h"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.318245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" event={"ID":"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac","Type":"ContainerStarted","Data":"3547188c0442dcc93380f2aecc8c910e14e5fe0d19b0fe22602f18e9a63827a2"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.337692 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" event={"ID":"0da0af7f-f8f8-492d-bd44-1e81ab242a24","Type":"ContainerStarted","Data":"6921e6d27e7dee7ceb4444fc64f8fad7f7e3c6af8d2c46d1d831032cbced4404"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.338299 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hxxkd"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.346658 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" event={"ID":"35ca1bac-5479-4e7f-80a5-c1811edc9e8e","Type":"ContainerStarted","Data":"a4eb8515b2b6362f98c6cdac68d937019825faa1528d1b43504d50f91d117161"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.366105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r8dcx" event={"ID":"a3a4a159-1ab3-412f-ac71-11a7a41012ea","Type":"ContainerStarted","Data":"ffa2030a5fa12af377a387f5f235d33162796cf624478203f6186832d3f12ade"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.394791 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" event={"ID":"4491a495-df05-43ef-bff7-2438317eac71","Type":"ContainerStarted","Data":"61b87cb2f15b2aa6261b9cfd630153a7503830a3de1e63bb8282a31307bf5406"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.397368 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt6qz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.397416 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt6qz" podUID="96e1b4b4-8d48-4955-b756-71d21a5aea0b" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.397653 4795 patch_prober.go:28] interesting pod/console-operator-58897d9998-bks74 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.397717 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bks74" podUID="f28c04c3-66f1-4c29-b7d1-cac5aa342370" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.403993 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.410857 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.415495 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.915480496 +0000 UTC m=+144.107998360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.424398 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" podStartSLOduration=119.424377748 podStartE2EDuration="1m59.424377748s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.388141492 +0000 UTC m=+143.580659366" watchObservedRunningTime="2026-02-19 21:30:32.424377748 +0000 UTC m=+143.616895612" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.491204 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:32 crc kubenswrapper[4795]: W0219 21:30:32.501120 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed4dfab_8b23_46d5_a983_db2ec1371ce2.slice/crio-6f0f435341f7c3d5c312608d16c6301b721319ab948b919042c34efe2e1663a6 WatchSource:0}: Error finding container 6f0f435341f7c3d5c312608d16c6301b721319ab948b919042c34efe2e1663a6: Status 404 returned error can't find the container with id 6f0f435341f7c3d5c312608d16c6301b721319ab948b919042c34efe2e1663a6 Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.544530 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r8dcx" 
podStartSLOduration=119.544509983 podStartE2EDuration="1m59.544509983s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.481815277 +0000 UTC m=+143.674333141" watchObservedRunningTime="2026-02-19 21:30:32.544509983 +0000 UTC m=+143.737027847" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.549602 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.560873 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.060840999 +0000 UTC m=+144.253358863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.561043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.561923 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.061898427 +0000 UTC m=+144.254416291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.593303 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" podStartSLOduration=119.593288336 podStartE2EDuration="1m59.593288336s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.59266533 +0000 UTC m=+143.785183214" watchObservedRunningTime="2026-02-19 21:30:32.593288336 +0000 UTC m=+143.785806200" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.602012 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" podStartSLOduration=119.601993703 podStartE2EDuration="1m59.601993703s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.548788455 +0000 UTC m=+143.741306319" watchObservedRunningTime="2026-02-19 21:30:32.601993703 +0000 UTC m=+143.794511568" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.628112 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rvkhj" podStartSLOduration=119.628094205 podStartE2EDuration="1m59.628094205s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.627687634 +0000 UTC m=+143.820205498" watchObservedRunningTime="2026-02-19 21:30:32.628094205 +0000 UTC m=+143.820612069" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.641728 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-rt6qz" podStartSLOduration=119.64171288 podStartE2EDuration="1m59.64171288s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.639598315 +0000 UTC m=+143.832116179" watchObservedRunningTime="2026-02-19 21:30:32.64171288 +0000 UTC m=+143.834230744" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.664797 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.665100 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.16508664 +0000 UTC m=+144.357604504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.699567 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bks74" podStartSLOduration=119.69955275 podStartE2EDuration="1m59.69955275s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.698334818 +0000 UTC m=+143.890852682" watchObservedRunningTime="2026-02-19 21:30:32.69955275 +0000 UTC m=+143.892070614" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.728812 4795 csr.go:261] certificate signing request csr-42z98 is approved, waiting to be issued Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.735589 4795 csr.go:257] certificate signing request csr-42z98 is issued Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.743148 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" podStartSLOduration=119.743125587 podStartE2EDuration="1m59.743125587s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.736040212 +0000 UTC m=+143.928558076" watchObservedRunningTime="2026-02-19 21:30:32.743125587 +0000 UTC m=+143.935643441" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.757870 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.760399 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.760447 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.766655 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.767067 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.267051441 +0000 UTC m=+144.459569305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.843861 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" podStartSLOduration=120.843846856 podStartE2EDuration="2m0.843846856s" podCreationTimestamp="2026-02-19 21:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.843165588 +0000 UTC m=+144.035683452" watchObservedRunningTime="2026-02-19 21:30:32.843846856 +0000 UTC m=+144.036364720" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.867424 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.867607 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.367578305 +0000 UTC m=+144.560096169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.867678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.868179 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.36816055 +0000 UTC m=+144.560678414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.885426 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" podStartSLOduration=119.885407281 podStartE2EDuration="1m59.885407281s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.884055475 +0000 UTC m=+144.076573339" watchObservedRunningTime="2026-02-19 21:30:32.885407281 +0000 UTC m=+144.077925145" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.968864 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.969953 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.469908266 +0000 UTC m=+144.662426130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.075909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.076197 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.57618513 +0000 UTC m=+144.768702994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.176875 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.177596 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.677581756 +0000 UTC m=+144.870099620 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.283660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.284076 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.784065145 +0000 UTC m=+144.976582999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.390925 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.391318 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.891300724 +0000 UTC m=+145.083818588 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.416367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" event={"ID":"1f61e27e-ff25-4d76-814b-ed72e576547c","Type":"ContainerStarted","Data":"8bcd435b33ea136204c8f0e8c059624b480764535edbe83ef981101bcd6f4ce9"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.425917 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-st76p" event={"ID":"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8","Type":"ContainerStarted","Data":"9cea8165b9fb5b94d36107b221ff7de0d864648d3305ffab89f15a34c5a73433"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.427471 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" event={"ID":"9be22b71-1386-4be4-a38a-6f3b97669b9c","Type":"ContainerStarted","Data":"1009d387cfe21828d5eaa8b00af7c1c54ece5e43bb881a0d6dafae5d4967d158"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.429981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" event={"ID":"0da0af7f-f8f8-492d-bd44-1e81ab242a24","Type":"ContainerStarted","Data":"ca5407bd9e071e7e6f3aca525f18b8d4c2bd120c4fb40c28ae454708659ee98e"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.432672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" event={"ID":"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca","Type":"ContainerStarted","Data":"79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.432703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" event={"ID":"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca","Type":"ContainerStarted","Data":"365d5c2e07de412e6c9e8f0e65078f4ceb7110e13c2cb20266daf040eaf8acbd"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.433527 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.441825 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dfw9n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.441868 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.478674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" event={"ID":"3d1887a7-d8a5-45f6-97fc-c32a870089ef","Type":"ContainerStarted","Data":"a8be11a5ecbdba1a373b058454c84bbf743649dbe6d8762cfdb2e2bf1d51f7b7"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.485357 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" podStartSLOduration=120.485334519 podStartE2EDuration="2m0.485334519s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.445388276 +0000 UTC m=+144.637906140" watchObservedRunningTime="2026-02-19 21:30:33.485334519 +0000 UTC m=+144.677852383" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.490993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" event={"ID":"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a","Type":"ContainerStarted","Data":"8f02da0738e9d178691c0389499a8480ffbd2f961710f2202fa44516bf81c4c6"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.491043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" event={"ID":"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a","Type":"ContainerStarted","Data":"169f9f502d25a3abb7242d9edf9ee834076b8950d2fb019424ade7bd23fc1053"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.492851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.496851 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.996837209 +0000 UTC m=+145.189355073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.513618 4795 generic.go:334] "Generic (PLEG): container finished" podID="35ca1bac-5479-4e7f-80a5-c1811edc9e8e" containerID="95e19a883af5daf03192e0708c4543079b32d50d08aa677517a9f6338924bec4" exitCode=0 Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.525583 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" event={"ID":"35ca1bac-5479-4e7f-80a5-c1811edc9e8e","Type":"ContainerDied","Data":"95e19a883af5daf03192e0708c4543079b32d50d08aa677517a9f6338924bec4"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.529477 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" event={"ID":"4491a495-df05-43ef-bff7-2438317eac71","Type":"ContainerStarted","Data":"8521c9a65633694ba5c47d95e1712947e71138c020f2ba77a274ce7f257f522b"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.532678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6n7h" event={"ID":"adcffec1-66db-4e9b-9502-be3c9b008dde","Type":"ContainerStarted","Data":"77eff233c81ca07bf464ee0b564513104cf6fb1e8d21b9eff4fbfe696fb19053"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.533271 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6n7h" event={"ID":"adcffec1-66db-4e9b-9502-be3c9b008dde","Type":"ContainerStarted","Data":"be39b5fb6d1e66cf1cc04be43ee6115a7e63a9f4b93f4c6ea7c9ca27a58dd0ea"} 
Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.548436 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-st76p" podStartSLOduration=6.548417715 podStartE2EDuration="6.548417715s" podCreationTimestamp="2026-02-19 21:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.547955933 +0000 UTC m=+144.740473797" watchObservedRunningTime="2026-02-19 21:30:33.548417715 +0000 UTC m=+144.740935579" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.548880 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" podStartSLOduration=120.548875647 podStartE2EDuration="2m0.548875647s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.48961698 +0000 UTC m=+144.682134844" watchObservedRunningTime="2026-02-19 21:30:33.548875647 +0000 UTC m=+144.741393511" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.549383 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" event={"ID":"a82f8242-05f0-48f7-8f86-cf472309b8e3","Type":"ContainerStarted","Data":"73e9ab517dfe3bd917e8cb4a706967758fa3729eaf10d2d76e6118ebe151cb27"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.549442 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" event={"ID":"a82f8242-05f0-48f7-8f86-cf472309b8e3","Type":"ContainerStarted","Data":"94abb0779964b77bc9695db82705b627d6a09f50ec89fb6e0bd009a50c32336a"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.570845 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" event={"ID":"73ed383c-c1e3-4f47-86d3-6faa77121e28","Type":"ContainerStarted","Data":"86d1b0f9d34d3e67ed2159559a57b46d5af93d1e4591581042d64f48b0477702"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.591967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" event={"ID":"b7390208-03b5-4ffa-9259-f5f1d9354c52","Type":"ContainerStarted","Data":"d5e5e4f50109a1f19e3f14be3f9a45e9fc807df79c25f4e093ae90602509eae3"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.596619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.597937 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.097921397 +0000 UTC m=+145.290439261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.600896 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" podStartSLOduration=120.600880484 podStartE2EDuration="2m0.600880484s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.600159946 +0000 UTC m=+144.792677810" watchObservedRunningTime="2026-02-19 21:30:33.600880484 +0000 UTC m=+144.793398348" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.610227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" event={"ID":"116348e7-5632-4244-8ae4-f81b06c6df4d","Type":"ContainerStarted","Data":"cd491d499a82aa69bb3da98d33c76fbfe67523b991060de30a6ef7599fedd161"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.610270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" event={"ID":"116348e7-5632-4244-8ae4-f81b06c6df4d","Type":"ContainerStarted","Data":"a0559ba05ff203b441c0fb43c75de1acf6cfe4fb5ffc0c89ca80f83d757b62c6"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.632816 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podStartSLOduration=120.632799417 
podStartE2EDuration="2m0.632799417s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.632674554 +0000 UTC m=+144.825192428" watchObservedRunningTime="2026-02-19 21:30:33.632799417 +0000 UTC m=+144.825317281" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.659687 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" event={"ID":"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a","Type":"ContainerStarted","Data":"ee9e3a4801a9d28bda13d120b281173089455737a945dafa5ddd8f9905238b39"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.661294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.676263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" event={"ID":"89594551-78e6-49af-9376-477cf01d2dc5","Type":"ContainerStarted","Data":"7d33e8b545692ae287cea893cce811ac6b8db4d57335286c08afaf3a2c5b4415"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.680418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" event={"ID":"96b2274a-7361-4463-8562-1319e967066b","Type":"ContainerStarted","Data":"316fbea86e860fc1f5090b074e37c6e1b9b7f3e461c7ed53f8b719ad99b89de0"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.680448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" event={"ID":"96b2274a-7361-4463-8562-1319e967066b","Type":"ContainerStarted","Data":"73b6566dab869dee9b85d42f0e46ae7dcb08a351ce23a9ea17d70d9e5449f3b1"} Feb 19 21:30:33 crc 
kubenswrapper[4795]: I0219 21:30:33.680837 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.683791 4795 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m4bg2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.683826 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" podUID="7005ed62-dcc4-4fb5-ac2b-3aba9de5708a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.684070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"8cb0ebaa94d308e7f429c893028dba537b16caa1897a2e1ce60c2be387bea7db"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.695726 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" event={"ID":"c66857b5-461d-4c11-8fa3-52cd619bba60","Type":"ContainerStarted","Data":"c64024d8c4819b2555225a8768af2f8162944f51b37f9c298c87f8232cba18a9"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.695939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" event={"ID":"c66857b5-461d-4c11-8fa3-52cd619bba60","Type":"ContainerStarted","Data":"f9fdc7e0c291cf9b2f22329e019263dbbe6aa801c5e605ee9aaf41b6bfbdbf95"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 
21:30:33.698282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.699369 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.199355865 +0000 UTC m=+145.391873729 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.700433 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" podStartSLOduration=120.700417972 podStartE2EDuration="2m0.700417972s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.659269868 +0000 UTC m=+144.851787732" watchObservedRunningTime="2026-02-19 21:30:33.700417972 +0000 UTC m=+144.892935836" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.700539 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" podStartSLOduration=120.700535435 podStartE2EDuration="2m0.700535435s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.698448151 +0000 UTC m=+144.890966015" watchObservedRunningTime="2026-02-19 21:30:33.700535435 +0000 UTC m=+144.893053299" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.729779 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" podStartSLOduration=120.729765747 podStartE2EDuration="2m0.729765747s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.725888436 +0000 UTC m=+144.918406300" watchObservedRunningTime="2026-02-19 21:30:33.729765747 +0000 UTC m=+144.922283611" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.736770 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-19 21:25:32 +0000 UTC, rotation deadline is 2026-12-08 16:07:39.213325082 +0000 UTC Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.736798 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7002h37m5.476529271s for next certificate rotation Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.746401 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" event={"ID":"b9f6ef9a-86de-4d04-ba12-30197f5c83ed","Type":"ContainerStarted","Data":"0a86514fc846cf89d9663375fe844770178044662956d35d21e21865cb19e91c"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.746447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" event={"ID":"b9f6ef9a-86de-4d04-ba12-30197f5c83ed","Type":"ContainerStarted","Data":"59df8637026a3509abce7c5c573f9d70f6008512281964698f67187d18a8d84c"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.758333 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" event={"ID":"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac","Type":"ContainerStarted","Data":"8f759ccd97553ffb79952af3bb4312047bf76086b9b0e676f54bb06778350bda"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.761142 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:33 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:33 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:33 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.761185 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.770553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" event={"ID":"275a566e-d165-434a-95c1-9154ede6e14e","Type":"ContainerStarted","Data":"41bd1e2fb64001b31b3537cbeb0b8f4e9d68c4a266e136ed56c8c548d3c92155"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.781432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wp452" 
event={"ID":"5667f763-a535-49eb-90a2-b78f1ebad0b7","Type":"ContainerStarted","Data":"6e3321abaa556d00dd5bb4aa054f1e474b25b71bb48beab02c6a824ce6afbf16"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.781483 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wp452" event={"ID":"5667f763-a535-49eb-90a2-b78f1ebad0b7","Type":"ContainerStarted","Data":"d657e6145213a444981d32c25878e6c8aebbf41ec7496db4ae641bcf17c4600e"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.782367 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" podStartSLOduration=120.78234473 podStartE2EDuration="2m0.78234473s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.78083369 +0000 UTC m=+144.973351554" watchObservedRunningTime="2026-02-19 21:30:33.78234473 +0000 UTC m=+144.974862594" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.783031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7m8cj" event={"ID":"9ed4dfab-8b23-46d5-a983-db2ec1371ce2","Type":"ContainerStarted","Data":"035f03143c1dcd4a5cdd1970765872e48d09ffd08d71a34d5c15ff08da5fa9e3"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.783051 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7m8cj" event={"ID":"9ed4dfab-8b23-46d5-a983-db2ec1371ce2","Type":"ContainerStarted","Data":"6f0f435341f7c3d5c312608d16c6301b721319ab948b919042c34efe2e1663a6"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.791925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" 
event={"ID":"ed6485ab-c517-41cd-a755-d5dc9557456b","Type":"ContainerStarted","Data":"c7c59f37f95c70304ef3b294ce77051edb09d011f297b891921c5ea231510190"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.799039 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" event={"ID":"821fa263-1235-4a62-8818-1f41d7e77a62","Type":"ContainerStarted","Data":"cb29f0bd525d1d4d261a058ab9d4851a3b8b426a577612388490598276033eb9"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.801095 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.802116 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.302077335 +0000 UTC m=+145.494595199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.802223 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.806908 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.30688983 +0000 UTC m=+145.499407764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.874562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" event={"ID":"49a1bc45-62f5-45d2-a475-b2b562cd9b98","Type":"ContainerStarted","Data":"0d5b40b626fbf1ff6723285343e564ff35f9a801d6e5d76de81dafa7b485fcdd"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.875226 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.902393 4795 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5t4db container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.902449 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" podUID="49a1bc45-62f5-45d2-a475-b2b562cd9b98" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.905421 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.905581 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.405561436 +0000 UTC m=+145.598079300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.906703 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.907045 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.407033954 +0000 UTC m=+145.599551818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.920662 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" podStartSLOduration=120.920642729 podStartE2EDuration="2m0.920642729s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.915660609 +0000 UTC m=+145.108178483" watchObservedRunningTime="2026-02-19 21:30:33.920642729 +0000 UTC m=+145.113160603" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.920767 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" podStartSLOduration=33.920761652 podStartE2EDuration="33.920761652s" podCreationTimestamp="2026-02-19 21:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.846475163 +0000 UTC m=+145.038993027" watchObservedRunningTime="2026-02-19 21:30:33.920761652 +0000 UTC m=+145.113279516" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.976229 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" 
event={"ID":"b11366da-6972-44ce-8e8c-151de77fa689","Type":"ContainerStarted","Data":"38e286dbef09b9d06c2af60f210cfc7b002762c25563f2396a11593922be7fae"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.977242 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.000792 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" podStartSLOduration=121.000777711 podStartE2EDuration="2m1.000777711s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.946905585 +0000 UTC m=+145.139423469" watchObservedRunningTime="2026-02-19 21:30:34.000777711 +0000 UTC m=+145.193295575" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.010006 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.010695 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.510668519 +0000 UTC m=+145.703186383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.035316 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pgfxb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.035647 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" podUID="b11366da-6972-44ce-8e8c-151de77fa689" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.072515 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" podStartSLOduration=121.072497343 podStartE2EDuration="2m1.072497343s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.068056457 +0000 UTC m=+145.260574321" watchObservedRunningTime="2026-02-19 21:30:34.072497343 +0000 UTC m=+145.265015207" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.074182 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" podStartSLOduration=121.074175346 podStartE2EDuration="2m1.074175346s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.999748034 +0000 UTC m=+145.192265898" watchObservedRunningTime="2026-02-19 21:30:34.074175346 +0000 UTC m=+145.266693210" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.115680 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" podStartSLOduration=121.115665539 podStartE2EDuration="2m1.115665539s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.096704814 +0000 UTC m=+145.289222678" watchObservedRunningTime="2026-02-19 21:30:34.115665539 +0000 UTC m=+145.308183403" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.116708 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-wp452" podStartSLOduration=121.116703626 podStartE2EDuration="2m1.116703626s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.113945624 +0000 UTC m=+145.306463488" watchObservedRunningTime="2026-02-19 21:30:34.116703626 +0000 UTC m=+145.309221490" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.118246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.119584 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.619569381 +0000 UTC m=+145.812087245 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.160020 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" podStartSLOduration=121.160004157 podStartE2EDuration="2m1.160004157s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.159633897 +0000 UTC m=+145.352151761" watchObservedRunningTime="2026-02-19 21:30:34.160004157 +0000 UTC m=+145.352522021" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.163704 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" podStartSLOduration=121.163695993 podStartE2EDuration="2m1.163695993s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.137087998 +0000 UTC m=+145.329605862" watchObservedRunningTime="2026-02-19 21:30:34.163695993 +0000 UTC m=+145.356213847" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.186961 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" podStartSLOduration=121.1869467 podStartE2EDuration="2m1.1869467s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.18579208 +0000 UTC m=+145.378309944" watchObservedRunningTime="2026-02-19 21:30:34.1869467 +0000 UTC m=+145.379464564" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.219727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.219871 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.719857839 +0000 UTC m=+145.912375693 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.220124 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.220425 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.720417093 +0000 UTC m=+145.912934957 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.234305 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7m8cj" podStartSLOduration=7.234292215 podStartE2EDuration="7.234292215s" podCreationTimestamp="2026-02-19 21:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.231311358 +0000 UTC m=+145.423829212" watchObservedRunningTime="2026-02-19 21:30:34.234292215 +0000 UTC m=+145.426810079" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.234771 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" podStartSLOduration=121.234766418 podStartE2EDuration="2m1.234766418s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.20956532 +0000 UTC m=+145.402083184" watchObservedRunningTime="2026-02-19 21:30:34.234766418 +0000 UTC m=+145.427284282" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.305249 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" podStartSLOduration=121.305236427 podStartE2EDuration="2m1.305236427s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.301645093 +0000 UTC m=+145.494162957" watchObservedRunningTime="2026-02-19 21:30:34.305236427 +0000 UTC m=+145.497754291" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.305657 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" podStartSLOduration=121.305639568 podStartE2EDuration="2m1.305639568s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.269454443 +0000 UTC m=+145.461972307" watchObservedRunningTime="2026-02-19 21:30:34.305639568 +0000 UTC m=+145.498157432" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.321472 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.321887 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.821872911 +0000 UTC m=+146.014390775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.423117 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.423819 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.923808072 +0000 UTC m=+146.116325936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.524656 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.525071 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.025056484 +0000 UTC m=+146.217574348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.627855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.628269 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.128254998 +0000 UTC m=+146.320772862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.728516 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.729021 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.229005657 +0000 UTC m=+146.421523521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.762656 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:34 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:34 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:34 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.762717 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.815041 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.815420 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.829771 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.830383 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.330367733 +0000 UTC m=+146.522885587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.930736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.931097 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.431076151 +0000 UTC m=+146.623594015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.975071 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.975394 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.976467 4795 patch_prober.go:28] interesting pod/apiserver-76f77b778f-t48rm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.976499 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" podUID="ed6485ab-c517-41cd-a755-d5dc9557456b" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.982834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" event={"ID":"35ca1bac-5479-4e7f-80a5-c1811edc9e8e","Type":"ContainerStarted","Data":"baac9fd9e19418cf7815f351248782eb3254e0242609ec82088db0d409d83853"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.983611 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.985169 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" event={"ID":"4491a495-df05-43ef-bff7-2438317eac71","Type":"ContainerStarted","Data":"447e03980132dc620ddb750533dc7366ebeea6d6100dd620bc4d50401b5dfb1c"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.987377 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" event={"ID":"275a566e-d165-434a-95c1-9154ede6e14e","Type":"ContainerStarted","Data":"6109fd1152f22fb3b193c8608c764935c95c318e0c9d1e71816f145bf442ea90"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.989600 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" event={"ID":"c66857b5-461d-4c11-8fa3-52cd619bba60","Type":"ContainerStarted","Data":"effc347963e485f919cb2646b27c3004a96259c88215be04078cf0db6a682e85"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.991259 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6n7h" event={"ID":"adcffec1-66db-4e9b-9502-be3c9b008dde","Type":"ContainerStarted","Data":"562372e76d08c2a484afcd0157b6171d224a10266b2bfcba849a4ba8b46225f2"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.991632 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.992583 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"59d30cab330fef3987235993f1393803a98398cc08f27df1b7f764a182964aae"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.994154 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" event={"ID":"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac","Type":"ContainerStarted","Data":"d48d1d12d674b168b9ecf175194c27f5d1f3329e9a58960e0b5c14b09de4ca7b"} Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.002845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" event={"ID":"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a","Type":"ContainerStarted","Data":"90410357321a2ffb19fdf39a1cd4975b34f403ce0d02edc585087774898072f3"} Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.004077 4795 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m4bg2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.004110 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" podUID="7005ed62-dcc4-4fb5-ac2b-3aba9de5708a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.008840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" event={"ID":"ed6485ab-c517-41cd-a755-d5dc9557456b","Type":"ContainerStarted","Data":"a3566673f0303cd80cc2ea7f3733afdf3ad6e57c924761ff75cb63c35e05fbeb"} Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.011802 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dfw9n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: 
connect: connection refused" start-of-body= Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.011845 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.037485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.037921 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.53790652 +0000 UTC m=+146.730424384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.052456 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.139050 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.141168 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.641150244 +0000 UTC m=+146.833668108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.186881 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" podStartSLOduration=122.186861697 podStartE2EDuration="2m2.186861697s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:35.184636859 +0000 UTC m=+146.377154753" watchObservedRunningTime="2026-02-19 21:30:35.186861697 +0000 UTC m=+146.379379561" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.187540 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" podStartSLOduration=122.187532425 podStartE2EDuration="2m2.187532425s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:35.093563582 +0000 UTC m=+146.286081446" watchObservedRunningTime="2026-02-19 21:30:35.187532425 +0000 UTC m=+146.380050289" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.241946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: 
\"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.242353 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.742339245 +0000 UTC m=+146.934857109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.250226 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" podStartSLOduration=122.250212531 podStartE2EDuration="2m2.250212531s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:35.24866379 +0000 UTC m=+146.441181654" watchObservedRunningTime="2026-02-19 21:30:35.250212531 +0000 UTC m=+146.442730385" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.311421 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.342750 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.343184 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.843161037 +0000 UTC m=+147.035678901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.440413 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-c6n7h" podStartSLOduration=8.440394235 podStartE2EDuration="8.440394235s" podCreationTimestamp="2026-02-19 21:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:35.3624284 +0000 UTC m=+146.554946264" watchObservedRunningTime="2026-02-19 21:30:35.440394235 +0000 UTC m=+146.632912099" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.443950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.444348 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.944328437 +0000 UTC m=+147.136846301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.545230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.545416 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.045387625 +0000 UTC m=+147.237905489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.545549 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.545831 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.045818946 +0000 UTC m=+147.238336810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.647107 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.647427 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.147396657 +0000 UTC m=+147.339914521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.647537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.647867 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.14785933 +0000 UTC m=+147.340377194 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.748396 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.748583 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.248556808 +0000 UTC m=+147.441074672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.748715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.749030 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.2490209 +0000 UTC m=+147.441538764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.759304 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:35 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:35 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:35 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.759364 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.850101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.850291 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:30:36.350262932 +0000 UTC m=+147.542780796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.850383 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.850637 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.350627952 +0000 UTC m=+147.543145916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.951837 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.952037 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.452010908 +0000 UTC m=+147.644528772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.952262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.952593 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.452580863 +0000 UTC m=+147.645098727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.015282 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pgfxb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.015326 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" podUID="b11366da-6972-44ce-8e8c-151de77fa689" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.016205 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dfw9n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.016246 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.020233 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.029414 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.052788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.053119 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.553104416 +0000 UTC m=+147.745622280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.065670 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.066551 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.069045 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.069513 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.095972 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.155135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.155666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.155695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.155865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.161612 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.661596168 +0000 UTC m=+147.854114032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.256990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.257273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.257396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.257434 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " 
pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.257970 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.258089 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.758073726 +0000 UTC m=+147.950591590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.258639 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.311362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") pod \"certified-operators-zzmtm\" (UID: 
\"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.360902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.361241 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.861212268 +0000 UTC m=+148.053730132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.380603 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.446370 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fq62x"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.447492 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.461468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.461603 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.961581048 +0000 UTC m=+148.154098912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.461778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.462077 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.96206998 +0000 UTC m=+148.154587844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.500941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.563085 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.563273 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.063233461 +0000 UTC m=+148.255751325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.563602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.563646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.563926 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.063914648 +0000 UTC m=+148.256432512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.563667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564147 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.568640 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.569722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.569838 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.572889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.664018 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.672329 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.673883 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.674208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.674302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.674333 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.674688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " 
pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.674756 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.174744381 +0000 UTC m=+148.367262245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.675099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.675528 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.684938 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.708025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") 
" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.724089 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.734097 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.740484 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.763682 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:36 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:36 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:36 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.763738 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.774036 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.778957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.779000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.779043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.779074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.779379 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.279367372 +0000 UTC m=+148.471885236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.829640 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.830525 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.847345 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.881621 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.881846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.881929 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.881962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.882359 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.882423 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.382409141 +0000 UTC m=+148.574927005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.882617 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.949138 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.983024 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.983105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") pod 
\"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.983125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.983144 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.983428 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.483416927 +0000 UTC m=+148.675934781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.018502 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.034559 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.049561 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"5e33ec90e2349cffba1bed6b000e6ddc3fa76b571da946385658fa940b065a10"} Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.049618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"5887ede1a359c126f9c483aacc645e76ff321fd23ce5eccfe6599ca7db6dc0e0"} Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.069966 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.084365 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.084604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.084653 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.084681 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.085447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.085737 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.585723998 +0000 UTC m=+148.778241862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.086054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.101808 4795 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.123807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.170108 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.186153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.193564 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.693548162 +0000 UTC m=+148.886066026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.294990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.295186 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.795148053 +0000 UTC m=+148.987665917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.295465 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.295875 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.795863261 +0000 UTC m=+148.988381125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.394026 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"] Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.396940 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.397041 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.897024452 +0000 UTC m=+149.089542316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.397325 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.397591 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.897583046 +0000 UTC m=+149.090100910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.479912 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.498287 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.498630 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.998615743 +0000 UTC m=+149.191133607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.579796 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:30:37 crc kubenswrapper[4795]: W0219 21:30:37.596175 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa17669_dc5e_46a8_a76d_befdbc69aeed.slice/crio-825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f WatchSource:0}: Error finding container 825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f: Status 404 returned error can't find the container with id 825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.599036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.599344 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:38.099333292 +0000 UTC m=+149.291851156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.650401 4795 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-19T21:30:37.101834458Z","Handler":null,"Name":""} Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.653018 4795 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.653044 4795 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.700658 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.706464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.761839 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:37 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:37 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:37 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.761925 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.802565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.806645 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.806690 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.830493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.936944 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.051952 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerID="7884e76708116ab40efc68a93802f7cd42d9acd2ddce9816192d25b3558b0941" exitCode=0 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.052013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerDied","Data":"7884e76708116ab40efc68a93802f7cd42d9acd2ddce9816192d25b3558b0941"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.052346 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerStarted","Data":"56f8a93ddff4618883796150de8b693b1c3a76f9b5f00a99b738a48400fed9ee"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.054344 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.060538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"eeeda87c478c0e1dada2824cab3e810a39e2f842567d1a3bb0ad3612f1a76b4a"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.063387 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerID="4275386b7725c8ec761b57381bafb0a0755373bf551591680c5fa169d19954f2" exitCode=0 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.063441 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" 
event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerDied","Data":"4275386b7725c8ec761b57381bafb0a0755373bf551591680c5fa169d19954f2"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.063459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerStarted","Data":"28fe4330f368c52acd76e8871982506eeb01297bf98866cc2f51b2139ec4aa1c"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.068547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"27be47797c25c6a79413f9b396c516b314e3ef3560b58fd64ee67c1fe2df8d32"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.068571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a3ed799ffa6ca54f767f59fc5711fe1659ebf77723f8d0f85d3fc88c2808c6fe"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.079998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"51b6a9c659607d7c5797c52ed691c081d0bf0058928d445bdf4dcf47ab7ea3a9"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.080034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"28fdba99ff83e7589075fc6aa68f08d877c3c4ad185af341ba92b78c15e02b95"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.083544 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"abcb30a3d78b3a1d89fac137a91a0fcba6f1c414993c128184e69434203178ba"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.083565 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"df08c48e80cceee41ffef0a876495946fd36f0962bd8df4b127cce127913da82"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.083739 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.092342 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" podStartSLOduration=11.092322229 podStartE2EDuration="11.092322229s" podCreationTimestamp="2026-02-19 21:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:38.088614522 +0000 UTC m=+149.281132386" watchObservedRunningTime="2026-02-19 21:30:38.092322229 +0000 UTC m=+149.284840103" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.095912 4795 generic.go:334] "Generic (PLEG): container finished" podID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerID="843a490c0d5b242211efc7cdd05db863b65b298c17a2a1bc1bcd8caa9584d3d3" exitCode=0 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.096006 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerDied","Data":"843a490c0d5b242211efc7cdd05db863b65b298c17a2a1bc1bcd8caa9584d3d3"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.096034 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerStarted","Data":"1cc35dfba309639dbd91f218472ea2c5630e482b3631fbe826d36494a013aa5f"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.098029 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerID="ed0bfebb52fd6133757aa14996934704806932b85d024110638b142b237490a2" exitCode=0 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.098734 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerDied","Data":"ed0bfebb52fd6133757aa14996934704806932b85d024110638b142b237490a2"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.098771 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerStarted","Data":"825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.173044 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:30:38 crc kubenswrapper[4795]: W0219 21:30:38.190405 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80407681_6091_46cc_836f_757ec4d16604.slice/crio-f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb WatchSource:0}: Error finding container f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb: Status 404 returned error can't find the container with id f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.230162 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.238680 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.240463 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.250867 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.324161 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vg2\" (UniqueName: \"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.324359 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.324457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.425902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.425957 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.426032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vg2\" (UniqueName: \"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.426485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.426514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.446491 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vg2\" (UniqueName: 
\"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.557340 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.622551 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.626511 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.661554 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.732706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.732756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.732779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnffn\" (UniqueName: 
\"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.760685 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:38 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:38 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:38 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.760756 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.774860 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:30:38 crc kubenswrapper[4795]: W0219 21:30:38.806869 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e5472a_2c4b_4b71_91fb_06c3d5fcca54.slice/crio-e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748 WatchSource:0}: Error finding container e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748: Status 404 returned error can't find the container with id e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.833666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.833711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.833736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnffn\" (UniqueName: \"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.834257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.834715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.854772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnffn\" (UniqueName: 
\"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.955930 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.110412 4795 generic.go:334] "Generic (PLEG): container finished" podID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" containerID="8f02da0738e9d178691c0389499a8480ffbd2f961710f2202fa44516bf81c4c6" exitCode=0 Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.110479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" event={"ID":"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a","Type":"ContainerDied","Data":"8f02da0738e9d178691c0389499a8480ffbd2f961710f2202fa44516bf81c4c6"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.114205 4795 generic.go:334] "Generic (PLEG): container finished" podID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerID="7d3a5224f2284bc0e0b27ce31d0a2ef7f65f8de8a465113fd61e0813421d7fde" exitCode=0 Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.114288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerDied","Data":"7d3a5224f2284bc0e0b27ce31d0a2ef7f65f8de8a465113fd61e0813421d7fde"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.114315 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerStarted","Data":"e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.153243 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" event={"ID":"80407681-6091-46cc-836f-757ec4d16604","Type":"ContainerStarted","Data":"65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.153293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" event={"ID":"80407681-6091-46cc-836f-757ec4d16604","Type":"ContainerStarted","Data":"f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.180844 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" podStartSLOduration=126.180823239 podStartE2EDuration="2m6.180823239s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:39.17857686 +0000 UTC m=+150.371094734" watchObservedRunningTime="2026-02-19 21:30:39.180823239 +0000 UTC m=+150.373341103" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.340987 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:30:39 crc kubenswrapper[4795]: W0219 21:30:39.355051 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd858d3ea_6432_49a9_9b32_2e36b61c6e57.slice/crio-28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e WatchSource:0}: Error finding container 28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e: Status 404 returned error can't find the container with id 28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.434496 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.436479 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.437501 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.438961 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.544517 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.545398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.545553 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.545581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") pod \"redhat-operators-5tclw\" (UID: 
\"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.646607 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.646659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.646720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.648118 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.648588 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " 
pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.678355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.698418 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.699334 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.701528 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.702065 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.705163 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.756226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.756321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.757345 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.763408 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:39 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:39 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:39 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.763799 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.784275 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.785809 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.785864 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.786763 4795 patch_prober.go:28] interesting pod/console-f9d7485db-rvkhj container/console namespace/openshift-console: Startup 
probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.786828 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rvkhj" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.792471 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt6qz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.792509 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt6qz" podUID="96e1b4b4-8d48-4955-b756-71d21a5aea0b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.792515 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt6qz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.792551 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-rt6qz" podUID="96e1b4b4-8d48-4955-b756-71d21a5aea0b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.831280 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.859566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.859841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.867526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.869647 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.869769 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.917727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.961790 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.961890 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.961915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.008558 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.020242 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.045635 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.064523 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.064613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.064729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.066519 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.066971 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.111878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.209729 4795 generic.go:334] "Generic (PLEG): container finished" podID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerID="3f2cfda31513c4d8d601e01af9b97c3ba687e6c071cb1cb6f9eb5d2af4229073" exitCode=0 Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.210670 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerDied","Data":"3f2cfda31513c4d8d601e01af9b97c3ba687e6c071cb1cb6f9eb5d2af4229073"} Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.210702 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerStarted","Data":"28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e"} Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.211096 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.236217 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.387204 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.582476 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.671467 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.712254 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:40 crc kubenswrapper[4795]: W0219 21:30:40.712606 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac83daf6_848e_4977_8bb9_a7b4db89618f.slice/crio-5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6 WatchSource:0}: Error finding container 5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6: Status 404 returned error can't find the container with id 5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6 Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.757271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.765146 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:40 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:40 crc kubenswrapper[4795]: [+]process-running ok 
Feb 19 21:30:40 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.765765 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.792568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") pod \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.792638 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") pod \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.792672 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") pod \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.798452 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume" (OuterVolumeSpecName: "config-volume") pod "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" (UID: "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.800232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm" (OuterVolumeSpecName: "kube-api-access-w9bnm") pod "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" (UID: "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a"). InnerVolumeSpecName "kube-api-access-w9bnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.801022 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" (UID: "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.882152 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.898346 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.898463 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.898473 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:41 crc 
kubenswrapper[4795]: I0219 21:30:41.222046 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerID="b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd" exitCode=0 Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.222236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerDied","Data":"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.222463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerStarted","Data":"5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.229057 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd535c51-1ece-4449-823a-cf80a095eaeb","Type":"ContainerStarted","Data":"8c3aef20d6522b3352b0833650c257ae48627d27a744a1ceabc69c6a6ad6e576"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.229081 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd535c51-1ece-4449-823a-cf80a095eaeb","Type":"ContainerStarted","Data":"5c4de5efa33ea824be10af821058f85db139c88f413554e1f8448d076363ae4c"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.232230 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" event={"ID":"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a","Type":"ContainerDied","Data":"169f9f502d25a3abb7242d9edf9ee834076b8950d2fb019424ade7bd23fc1053"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.232254 4795 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="169f9f502d25a3abb7242d9edf9ee834076b8950d2fb019424ade7bd23fc1053" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.232291 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.249591 4795 generic.go:334] "Generic (PLEG): container finished" podID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerID="e32b30648a4cba474dcfdca283954a8496e4f09ea18f9bc9f3563fdbbe7453f6" exitCode=0 Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.250454 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerDied","Data":"e32b30648a4cba474dcfdca283954a8496e4f09ea18f9bc9f3563fdbbe7453f6"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.250636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerStarted","Data":"d567b35b55d0a2cbb795271be9aef7ece38aac167cf48328a27f71f0b916ce76"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.273360 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.273333902 podStartE2EDuration="2.273333902s" podCreationTimestamp="2026-02-19 21:30:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:41.258328031 +0000 UTC m=+152.450845895" watchObservedRunningTime="2026-02-19 21:30:41.273333902 +0000 UTC m=+152.465851776" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.291627 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 21:30:41 crc 
kubenswrapper[4795]: E0219 21:30:41.291849 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" containerName="collect-profiles" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.291865 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" containerName="collect-profiles" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.291970 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" containerName="collect-profiles" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.293721 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.297572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.299848 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.314971 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.402810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.403216 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") pod 
\"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.505059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.505111 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.505265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.520830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.612813 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.760441 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:41 crc kubenswrapper[4795]: [+]has-synced ok Feb 19 21:30:41 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:41 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.760790 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.929301 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 21:30:42 crc kubenswrapper[4795]: I0219 21:30:42.258611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd535c51-1ece-4449-823a-cf80a095eaeb","Type":"ContainerDied","Data":"8c3aef20d6522b3352b0833650c257ae48627d27a744a1ceabc69c6a6ad6e576"} Feb 19 21:30:42 crc kubenswrapper[4795]: I0219 21:30:42.259074 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd535c51-1ece-4449-823a-cf80a095eaeb" containerID="8c3aef20d6522b3352b0833650c257ae48627d27a744a1ceabc69c6a6ad6e576" exitCode=0 Feb 19 21:30:42 crc kubenswrapper[4795]: I0219 21:30:42.261923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ac441c3-3e7c-482d-a324-0c383d0be8ef","Type":"ContainerStarted","Data":"aeab273dd8030b77f965cee3b847bb8f7437db9bf610dae5a29b1f66ab406354"} Feb 19 21:30:42 crc 
kubenswrapper[4795]: I0219 21:30:42.764028 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:42 crc kubenswrapper[4795]: I0219 21:30:42.772640 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:43 crc kubenswrapper[4795]: I0219 21:30:43.287123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ac441c3-3e7c-482d-a324-0c383d0be8ef","Type":"ContainerStarted","Data":"029945b018ee1703eb6b861ff838efae767f8cd12918eeea6f22170a6224219e"} Feb 19 21:30:43 crc kubenswrapper[4795]: I0219 21:30:43.306117 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.306095587 podStartE2EDuration="2.306095587s" podCreationTimestamp="2026-02-19 21:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:43.299474335 +0000 UTC m=+154.491992209" watchObservedRunningTime="2026-02-19 21:30:43.306095587 +0000 UTC m=+154.498613451" Feb 19 21:30:44 crc kubenswrapper[4795]: I0219 21:30:44.294388 4795 generic.go:334] "Generic (PLEG): container finished" podID="6ac441c3-3e7c-482d-a324-0c383d0be8ef" containerID="029945b018ee1703eb6b861ff838efae767f8cd12918eeea6f22170a6224219e" exitCode=0 Feb 19 21:30:44 crc kubenswrapper[4795]: I0219 21:30:44.294478 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ac441c3-3e7c-482d-a324-0c383d0be8ef","Type":"ContainerDied","Data":"029945b018ee1703eb6b861ff838efae767f8cd12918eeea6f22170a6224219e"} Feb 19 21:30:45 crc kubenswrapper[4795]: I0219 21:30:45.907910 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:49 crc kubenswrapper[4795]: I0219 21:30:49.796711 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:49 crc kubenswrapper[4795]: I0219 21:30:49.841736 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:49 crc kubenswrapper[4795]: I0219 21:30:49.845967 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.767962 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.872230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") pod \"bd535c51-1ece-4449-823a-cf80a095eaeb\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.872737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") pod \"bd535c51-1ece-4449-823a-cf80a095eaeb\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.872986 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bd535c51-1ece-4449-823a-cf80a095eaeb" (UID: "bd535c51-1ece-4449-823a-cf80a095eaeb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.873127 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.882370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bd535c51-1ece-4449-823a-cf80a095eaeb" (UID: "bd535c51-1ece-4449-823a-cf80a095eaeb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.974611 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.374796 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd535c51-1ece-4449-823a-cf80a095eaeb","Type":"ContainerDied","Data":"5c4de5efa33ea824be10af821058f85db139c88f413554e1f8448d076363ae4c"} Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.374839 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c4de5efa33ea824be10af821058f85db139c88f413554e1f8448d076363ae4c" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.374866 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.547459 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.580610 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") pod \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.580974 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") pod \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.580761 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6ac441c3-3e7c-482d-a324-0c383d0be8ef" (UID: "6ac441c3-3e7c-482d-a324-0c383d0be8ef"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.584687 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6ac441c3-3e7c-482d-a324-0c383d0be8ef" (UID: "6ac441c3-3e7c-482d-a324-0c383d0be8ef"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.683020 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.683049 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:52 crc kubenswrapper[4795]: I0219 21:30:52.380804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ac441c3-3e7c-482d-a324-0c383d0be8ef","Type":"ContainerDied","Data":"aeab273dd8030b77f965cee3b847bb8f7437db9bf610dae5a29b1f66ab406354"} Feb 19 21:30:52 crc kubenswrapper[4795]: I0219 21:30:52.380858 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:52 crc kubenswrapper[4795]: I0219 21:30:52.380870 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeab273dd8030b77f965cee3b847bb8f7437db9bf610dae5a29b1f66ab406354" Feb 19 21:30:55 crc kubenswrapper[4795]: I0219 21:30:55.329657 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:55 crc kubenswrapper[4795]: I0219 21:30:55.333598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:55 crc kubenswrapper[4795]: I0219 21:30:55.527795 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:57 crc kubenswrapper[4795]: I0219 21:30:57.942776 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:58 crc kubenswrapper[4795]: I0219 21:30:58.427477 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:30:58 crc kubenswrapper[4795]: I0219 21:30:58.427527 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:31:08 crc kubenswrapper[4795]: E0219 21:31:08.491890 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 19 21:31:08 crc kubenswrapper[4795]: E0219 21:31:08.492798 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jst7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zpkx6_openshift-marketplace(ac83daf6-848e-4977-8bb9-a7b4db89618f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 21:31:08 crc kubenswrapper[4795]: E0219 21:31:08.494118 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zpkx6" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" Feb 19 21:31:08 crc 
kubenswrapper[4795]: I0219 21:31:08.777421 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ff4bs"] Feb 19 21:31:08 crc kubenswrapper[4795]: W0219 21:31:08.785633 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1b4346_e02e_4614_b2ff_e4628046a92f.slice/crio-1216a86ba52aa2b06a57fc597c8fa689ed6ddd544ad0f4709f859040348d817f WatchSource:0}: Error finding container 1216a86ba52aa2b06a57fc597c8fa689ed6ddd544ad0f4709f859040348d817f: Status 404 returned error can't find the container with id 1216a86ba52aa2b06a57fc597c8fa689ed6ddd544ad0f4709f859040348d817f Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.467107 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerID="90b9312e0f5408fc251999d131140fd68f045334ecdf8f5a7e0995574996fb05" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.467475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerDied","Data":"90b9312e0f5408fc251999d131140fd68f045334ecdf8f5a7e0995574996fb05"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.470131 4795 generic.go:334] "Generic (PLEG): container finished" podID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerID="2bc16809b25a9de1b1eb527dd20e661703d80bd6e75ac128fd97a218812a8320" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.470200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerDied","Data":"2bc16809b25a9de1b1eb527dd20e661703d80bd6e75ac128fd97a218812a8320"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.472851 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerID="cb4588aca4785a571a304fb53b83b17a5d4fe720455aa15c94f9117f8cbe5514" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.472875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerDied","Data":"cb4588aca4785a571a304fb53b83b17a5d4fe720455aa15c94f9117f8cbe5514"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.475534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" event={"ID":"1b1b4346-e02e-4614-b2ff-e4628046a92f","Type":"ContainerStarted","Data":"4d69db8866e7c06900356daa727588178e6f7179fc0418bc2e66851ee0f66ee5"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.475599 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" event={"ID":"1b1b4346-e02e-4614-b2ff-e4628046a92f","Type":"ContainerStarted","Data":"9cfc570c13b0a6e9e34f1e123b7f4cab69a16b6ca9c1d2cc2e68267610fe8d15"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.475615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" event={"ID":"1b1b4346-e02e-4614-b2ff-e4628046a92f","Type":"ContainerStarted","Data":"1216a86ba52aa2b06a57fc597c8fa689ed6ddd544ad0f4709f859040348d817f"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.480212 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerID="b730d866cb5930ce6b3bd30068b74b30f568d9da17e9b40a916b2db10b9b7380" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.480237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerDied","Data":"b730d866cb5930ce6b3bd30068b74b30f568d9da17e9b40a916b2db10b9b7380"} 
Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.484608 4795 generic.go:334] "Generic (PLEG): container finished" podID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerID="76fd4bbcb0278e373cff16d83c7ceda3fc1a11cdd6076ee88c76e62b714b1bad" exitCode=0
Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.484698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerDied","Data":"76fd4bbcb0278e373cff16d83c7ceda3fc1a11cdd6076ee88c76e62b714b1bad"}
Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.490415 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerID="099f13240cf3aea087d163d3e1052bf3c9c7278f2cd6182ccc06266ec67326ae" exitCode=0
Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.490551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerDied","Data":"099f13240cf3aea087d163d3e1052bf3c9c7278f2cd6182ccc06266ec67326ae"}
Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.498709 4795 generic.go:334] "Generic (PLEG): container finished" podID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerID="1fec75a7b046544f542f990a4f1a07786af212b43341a6cab76f9622d84d42c4" exitCode=0
Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.499551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerDied","Data":"1fec75a7b046544f542f990a4f1a07786af212b43341a6cab76f9622d84d42c4"}
Feb 19 21:31:09 crc kubenswrapper[4795]: E0219 21:31:09.510805 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zpkx6" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f"
Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.525310 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ff4bs" podStartSLOduration=156.525288879 podStartE2EDuration="2m36.525288879s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:31:09.507742581 +0000 UTC m=+180.700260465" watchObservedRunningTime="2026-02-19 21:31:09.525288879 +0000 UTC m=+180.717806753"
Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.524935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerStarted","Data":"e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d"}
Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.532424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c"
Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.549459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerStarted","Data":"13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f"}
Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.581051 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zzmtm" podStartSLOduration=2.43949081 podStartE2EDuration="34.581012803s" podCreationTimestamp="2026-02-19 21:30:36 +0000 UTC" firstStartedPulling="2026-02-19 21:30:38.097073263 +0000 UTC m=+149.289591117" lastFinishedPulling="2026-02-19 21:31:10.238595236 +0000 UTC m=+181.431113110" observedRunningTime="2026-02-19 21:31:10.553772082 +0000 UTC m=+181.746289966" watchObservedRunningTime="2026-02-19 21:31:10.581012803 +0000 UTC m=+181.773530697"
Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.619777 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fq62x" podStartSLOduration=2.344405468 podStartE2EDuration="34.619743154s" podCreationTimestamp="2026-02-19 21:30:36 +0000 UTC" firstStartedPulling="2026-02-19 21:30:38.05405459 +0000 UTC m=+149.246572454" lastFinishedPulling="2026-02-19 21:31:10.329392276 +0000 UTC m=+181.521910140" observedRunningTime="2026-02-19 21:31:10.596923008 +0000 UTC m=+181.789440902" watchObservedRunningTime="2026-02-19 21:31:10.619743154 +0000 UTC m=+181.812261008"
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.555711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerStarted","Data":"f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474"}
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.557752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerStarted","Data":"124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81"}
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.559449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerStarted","Data":"a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00"}
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.561212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerStarted","Data":"e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec"}
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.563927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerStarted","Data":"7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f"}
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.573661 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bmzl7" podStartSLOduration=2.3790651130000002 podStartE2EDuration="33.573648021s" podCreationTimestamp="2026-02-19 21:30:38 +0000 UTC" firstStartedPulling="2026-02-19 21:30:39.116845559 +0000 UTC m=+150.309363423" lastFinishedPulling="2026-02-19 21:31:10.311428437 +0000 UTC m=+181.503946331" observedRunningTime="2026-02-19 21:31:10.621004567 +0000 UTC m=+181.813522471" watchObservedRunningTime="2026-02-19 21:31:11.573648021 +0000 UTC m=+182.766165885"
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.576756 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5j7b9" podStartSLOduration=3.138516995 podStartE2EDuration="35.576749902s" podCreationTimestamp="2026-02-19 21:30:36 +0000 UTC" firstStartedPulling="2026-02-19 21:30:38.099827935 +0000 UTC m=+149.292345809" lastFinishedPulling="2026-02-19 21:31:10.538060842 +0000 UTC m=+181.730578716" observedRunningTime="2026-02-19 21:31:11.572693576 +0000 UTC m=+182.765211440" watchObservedRunningTime="2026-02-19 21:31:11.576749902 +0000 UTC m=+182.769267766"
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.595621 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v698q" podStartSLOduration=3.064592825 podStartE2EDuration="33.595601034s" podCreationTimestamp="2026-02-19 21:30:38 +0000 UTC" firstStartedPulling="2026-02-19 21:30:40.234600812 +0000 UTC m=+151.427118676" lastFinishedPulling="2026-02-19 21:31:10.765609021 +0000 UTC m=+181.958126885" observedRunningTime="2026-02-19 21:31:11.592494643 +0000 UTC m=+182.785012497" watchObservedRunningTime="2026-02-19 21:31:11.595601034 +0000 UTC m=+182.788118898"
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.611843 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c9sh5" podStartSLOduration=3.030864155 podStartE2EDuration="35.611822307s" podCreationTimestamp="2026-02-19 21:30:36 +0000 UTC" firstStartedPulling="2026-02-19 21:30:38.06629394 +0000 UTC m=+149.258811804" lastFinishedPulling="2026-02-19 21:31:10.647252092 +0000 UTC m=+181.839769956" observedRunningTime="2026-02-19 21:31:11.611788096 +0000 UTC m=+182.804305970" watchObservedRunningTime="2026-02-19 21:31:11.611822307 +0000 UTC m=+182.804340171"
Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.636745 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5tclw" podStartSLOduration=3.480116909 podStartE2EDuration="32.636724877s" podCreationTimestamp="2026-02-19 21:30:39 +0000 UTC" firstStartedPulling="2026-02-19 21:30:41.265940009 +0000 UTC m=+152.458457873" lastFinishedPulling="2026-02-19 21:31:10.422547977 +0000 UTC m=+181.615065841" observedRunningTime="2026-02-19 21:31:11.635712631 +0000 UTC m=+182.828230505" watchObservedRunningTime="2026-02-19 21:31:11.636724877 +0000 UTC m=+182.829242741"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.381630 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zzmtm"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.382239 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zzmtm"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.470744 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 21:31:16 crc kubenswrapper[4795]: E0219 21:31:16.470940 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac441c3-3e7c-482d-a324-0c383d0be8ef" containerName="pruner"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.470951 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac441c3-3e7c-482d-a324-0c383d0be8ef" containerName="pruner"
Feb 19 21:31:16 crc kubenswrapper[4795]: E0219 21:31:16.470960 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd535c51-1ece-4449-823a-cf80a095eaeb" containerName="pruner"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.470966 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd535c51-1ece-4449-823a-cf80a095eaeb" containerName="pruner"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.471054 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd535c51-1ece-4449-823a-cf80a095eaeb" containerName="pruner"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.471068 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac441c3-3e7c-482d-a324-0c383d0be8ef" containerName="pruner"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.471451 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.475096 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.475431 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.486495 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.503998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.504279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.605688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.605771 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.605868 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.623981 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.757486 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.776473 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fq62x"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.776536 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fq62x"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.778695 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zzmtm"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.787868 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.829510 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fq62x"
Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.830515 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zzmtm"
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.020483 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c9sh5"
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.020813 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c9sh5"
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.064798 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c9sh5"
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.170657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5j7b9"
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.170771 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5j7b9"
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.209344 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5j7b9"
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.227034 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 21:31:17 crc kubenswrapper[4795]: W0219 21:31:17.233186 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9b7f5192_259d_44ff_9e42_5ab977c95519.slice/crio-080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152 WatchSource:0}: Error finding container 080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152: Status 404 returned error can't find the container with id 080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.598523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9b7f5192-259d-44ff-9e42-5ab977c95519","Type":"ContainerStarted","Data":"080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152"}
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.640875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c9sh5"
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.642894 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5j7b9"
Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.643334 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fq62x"
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.427398 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"]
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.557974 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bmzl7"
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.558041 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bmzl7"
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.597417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bmzl7"
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.603732 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b7f5192-259d-44ff-9e42-5ab977c95519" containerID="f9d8e5687bb8b8d8eea28ea3abe15bf40dbdf4c746016f8ddecd8da2c5b538dd" exitCode=0
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.603821 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9b7f5192-259d-44ff-9e42-5ab977c95519","Type":"ContainerDied","Data":"f9d8e5687bb8b8d8eea28ea3abe15bf40dbdf4c746016f8ddecd8da2c5b538dd"}
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.649599 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bmzl7"
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.920537 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"]
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.956086 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v698q"
Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.956154 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v698q"
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.000784 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v698q"
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.428554 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"]
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.614749 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fq62x" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="registry-server" containerID="cri-o://13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f" gracePeriod=2
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.665351 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v698q"
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.758624 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5tclw"
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.758673 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5tclw"
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.823970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5tclw"
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.933064 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.945873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") pod \"9b7f5192-259d-44ff-9e42-5ab977c95519\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") "
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.945924 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") pod \"9b7f5192-259d-44ff-9e42-5ab977c95519\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") "
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.949280 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9b7f5192-259d-44ff-9e42-5ab977c95519" (UID: "9b7f5192-259d-44ff-9e42-5ab977c95519"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.951337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9b7f5192-259d-44ff-9e42-5ab977c95519" (UID: "9b7f5192-259d-44ff-9e42-5ab977c95519"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.046887 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.046916 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.623276 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.623276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9b7f5192-259d-44ff-9e42-5ab977c95519","Type":"ContainerDied","Data":"080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152"}
Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.623735 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152"
Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.625236 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerID="13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f" exitCode=0
Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.625393 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerDied","Data":"13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f"}
Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.626230 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5j7b9" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="registry-server" containerID="cri-o://f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474" gracePeriod=2
Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.661733 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5tclw"
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.122906 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fq62x"
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.260240 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") pod \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") "
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.260347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") pod \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") "
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.260373 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") pod \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") "
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.261096 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities" (OuterVolumeSpecName: "utilities") pod "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" (UID: "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.265963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx" (OuterVolumeSpecName: "kube-api-access-jr9vx") pod "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" (UID: "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7"). InnerVolumeSpecName "kube-api-access-jr9vx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.308651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" (UID: "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.361240 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") on node \"crc\" DevicePath \"\""
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.361275 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.361288 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.633321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerDied","Data":"56f8a93ddff4618883796150de8b693b1c3a76f9b5f00a99b738a48400fed9ee"}
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.633725 4795 scope.go:117] "RemoveContainer" containerID="13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f"
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.633372 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fq62x"
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.635718 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerID="f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474" exitCode=0
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.635979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerDied","Data":"f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474"}
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.678868 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5j7b9"
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.687714 4795 scope.go:117] "RemoveContainer" containerID="b730d866cb5930ce6b3bd30068b74b30f568d9da17e9b40a916b2db10b9b7380"
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.690439 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"]
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.692904 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"]
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.706958 4795 scope.go:117] "RemoveContainer" containerID="7884e76708116ab40efc68a93802f7cd42d9acd2ddce9816192d25b3558b0941"
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.825380 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"]
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.825778 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v698q" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="registry-server" containerID="cri-o://a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00" gracePeriod=2
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.866386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") pod \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") "
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.866432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") pod \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") "
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.866506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") pod \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") "
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.867344 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities" (OuterVolumeSpecName: "utilities") pod "1fa17669-dc5e-46a8-a76d-befdbc69aeed" (UID: "1fa17669-dc5e-46a8-a76d-befdbc69aeed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.870443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw" (OuterVolumeSpecName: "kube-api-access-l9dxw") pod "1fa17669-dc5e-46a8-a76d-befdbc69aeed" (UID: "1fa17669-dc5e-46a8-a76d-befdbc69aeed"). InnerVolumeSpecName "kube-api-access-l9dxw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.967906 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") on node \"crc\" DevicePath \"\""
Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.967938 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.166910 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fa17669-dc5e-46a8-a76d-befdbc69aeed" (UID: "1fa17669-dc5e-46a8-a76d-befdbc69aeed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.169426 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.643815 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerDied","Data":"825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f"}
Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.643871 4795 scope.go:117] "RemoveContainer" containerID="f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474"
Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.643983 4795 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.653287 4795 generic.go:334] "Generic (PLEG): container finished" podID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerID="a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00" exitCode=0 Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.653327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerDied","Data":"a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00"} Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.668872 4795 scope.go:117] "RemoveContainer" containerID="90b9312e0f5408fc251999d131140fd68f045334ecdf8f5a7e0995574996fb05" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.680550 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.682874 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.710398 4795 scope.go:117] "RemoveContainer" containerID="ed0bfebb52fd6133757aa14996934704806932b85d024110638b142b237490a2" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.910588 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.982348 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") pod \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.982422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnffn\" (UniqueName: \"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") pod \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.982455 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") pod \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.983388 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities" (OuterVolumeSpecName: "utilities") pod "d858d3ea-6432-49a9-9b32-2e36b61c6e57" (UID: "d858d3ea-6432-49a9-9b32-2e36b61c6e57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.986896 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn" (OuterVolumeSpecName: "kube-api-access-hnffn") pod "d858d3ea-6432-49a9-9b32-2e36b61c6e57" (UID: "d858d3ea-6432-49a9-9b32-2e36b61c6e57"). InnerVolumeSpecName "kube-api-access-hnffn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.004507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d858d3ea-6432-49a9-9b32-2e36b61c6e57" (UID: "d858d3ea-6432-49a9-9b32-2e36b61c6e57"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.083307 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnffn\" (UniqueName: \"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.083676 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.083688 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.517830 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" path="/var/lib/kubelet/pods/1fa17669-dc5e-46a8-a76d-befdbc69aeed/volumes" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.518421 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" path="/var/lib/kubelet/pods/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7/volumes" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.673200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" 
event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerDied","Data":"28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e"} Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.673238 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.673260 4795 scope.go:117] "RemoveContainer" containerID="a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.689056 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.689490 4795 scope.go:117] "RemoveContainer" containerID="76fd4bbcb0278e373cff16d83c7ceda3fc1a11cdd6076ee88c76e62b714b1bad" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.692391 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.720484 4795 scope.go:117] "RemoveContainer" containerID="3f2cfda31513c4d8d601e01af9b97c3ba687e6c071cb1cb6f9eb5d2af4229073" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267541 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267728 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267739 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267748 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="registry-server" Feb 19 
21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267755 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267766 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267772 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267781 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267787 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267794 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267800 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267812 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7f5192-259d-44ff-9e42-5ab977c95519" containerName="pruner" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267819 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7f5192-259d-44ff-9e42-5ab977c95519" containerName="pruner" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267831 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="extract-utilities" Feb 19 21:31:24 crc 
kubenswrapper[4795]: I0219 21:31:24.267838 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267848 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267867 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267874 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267884 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267891 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.268009 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7f5192-259d-44ff-9e42-5ab977c95519" containerName="pruner" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.268021 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.268035 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="registry-server" Feb 19 21:31:24 crc 
kubenswrapper[4795]: I0219 21:31:24.268045 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.268426 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.270091 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.270346 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.278596 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.397463 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.397514 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.397549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") pod \"installer-9-crc\" (UID: 
\"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498609 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498701 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.516322 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.598554 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.021657 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 21:31:25 crc kubenswrapper[4795]: W0219 21:31:25.031350 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6b9bcd07_a45a_4f32_9d56_9ebb0931b1a6.slice/crio-6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f WatchSource:0}: Error finding container 6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f: Status 404 returned error can't find the container with id 6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.518315 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" path="/var/lib/kubelet/pods/d858d3ea-6432-49a9-9b32-2e36b61c6e57/volumes" Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.686651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerStarted","Data":"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572"} Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.688400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6","Type":"ContainerStarted","Data":"d1854bd6a757d0896d79bed3a255459850433d4bfa0a2f7e67645e00599c61cf"} Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.688461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6","Type":"ContainerStarted","Data":"6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f"} Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.736770 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.736753427 podStartE2EDuration="1.736753427s" podCreationTimestamp="2026-02-19 21:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:31:25.733575359 +0000 UTC m=+196.926093243" watchObservedRunningTime="2026-02-19 21:31:25.736753427 +0000 UTC m=+196.929271291" Feb 19 21:31:26 crc kubenswrapper[4795]: I0219 21:31:26.694801 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerID="43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572" exitCode=0 Feb 19 21:31:26 crc kubenswrapper[4795]: I0219 21:31:26.694863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerDied","Data":"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572"} Feb 19 21:31:28 crc kubenswrapper[4795]: I0219 21:31:28.428071 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:31:28 crc 
kubenswrapper[4795]: I0219 21:31:28.428504 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:31:30 crc kubenswrapper[4795]: I0219 21:31:30.726710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerStarted","Data":"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f"} Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.237877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.238730 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.313525 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.338804 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zpkx6" podStartSLOduration=12.933403007999999 podStartE2EDuration="1m1.338789275s" podCreationTimestamp="2026-02-19 21:30:39 +0000 UTC" firstStartedPulling="2026-02-19 21:30:41.224110738 +0000 UTC m=+152.416628602" lastFinishedPulling="2026-02-19 21:31:29.629497005 +0000 UTC m=+200.822014869" observedRunningTime="2026-02-19 21:31:30.746277788 +0000 UTC m=+201.938795672" watchObservedRunningTime="2026-02-19 21:31:40.338789275 +0000 UTC m=+211.531307129" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.847975 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.912464 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:31:42 crc kubenswrapper[4795]: I0219 21:31:42.785966 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zpkx6" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="registry-server" containerID="cri-o://f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" gracePeriod=2 Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.162563 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.243569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") pod \"ac83daf6-848e-4977-8bb9-a7b4db89618f\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.243799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") pod \"ac83daf6-848e-4977-8bb9-a7b4db89618f\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.243851 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") pod \"ac83daf6-848e-4977-8bb9-a7b4db89618f\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.244486 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities" (OuterVolumeSpecName: "utilities") pod "ac83daf6-848e-4977-8bb9-a7b4db89618f" (UID: "ac83daf6-848e-4977-8bb9-a7b4db89618f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.252566 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w" (OuterVolumeSpecName: "kube-api-access-jst7w") pod "ac83daf6-848e-4977-8bb9-a7b4db89618f" (UID: "ac83daf6-848e-4977-8bb9-a7b4db89618f"). InnerVolumeSpecName "kube-api-access-jst7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.346249 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.346321 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.405989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac83daf6-848e-4977-8bb9-a7b4db89618f" (UID: "ac83daf6-848e-4977-8bb9-a7b4db89618f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.448091 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794542 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerID="f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" exitCode=0 Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerDied","Data":"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f"} Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerDied","Data":"5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6"} Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794655 4795 scope.go:117] "RemoveContainer" containerID="f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794799 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.826258 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.829023 4795 scope.go:117] "RemoveContainer" containerID="43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.842693 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.856221 4795 scope.go:117] "RemoveContainer" containerID="b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.881365 4795 scope.go:117] "RemoveContainer" containerID="f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" Feb 19 21:31:43 crc kubenswrapper[4795]: E0219 21:31:43.882057 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f\": container with ID starting with f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f not found: ID does not exist" containerID="f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.882109 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f"} err="failed to get container status \"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f\": rpc error: code = NotFound desc = could not find container \"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f\": container with ID starting with f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f not found: ID does 
not exist" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.882200 4795 scope.go:117] "RemoveContainer" containerID="43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572" Feb 19 21:31:43 crc kubenswrapper[4795]: E0219 21:31:43.882656 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572\": container with ID starting with 43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572 not found: ID does not exist" containerID="43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.882704 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572"} err="failed to get container status \"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572\": rpc error: code = NotFound desc = could not find container \"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572\": container with ID starting with 43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572 not found: ID does not exist" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.882731 4795 scope.go:117] "RemoveContainer" containerID="b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd" Feb 19 21:31:43 crc kubenswrapper[4795]: E0219 21:31:43.883051 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd\": container with ID starting with b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd not found: ID does not exist" containerID="b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.883080 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd"} err="failed to get container status \"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd\": rpc error: code = NotFound desc = could not find container \"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd\": container with ID starting with b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd not found: ID does not exist" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.950726 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" containerID="cri-o://4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" gracePeriod=15 Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.358643 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.462997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463043 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463112 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463157 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463226 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463242 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463262 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463407 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463550 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463980 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464149 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464314 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464579 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464597 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464611 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464623 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.465575 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: 
"9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.468456 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.468911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.469429 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.470051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.470063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.470313 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg" (OuterVolumeSpecName: "kube-api-access-cxfbg") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "kube-api-access-cxfbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.470327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.475507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.480432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565820 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565885 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565898 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565914 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565925 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565935 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565945 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565954 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565964 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565976 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805338 4795 generic.go:334] "Generic (PLEG): container finished" podID="9de314c5-1440-476b-b98b-7804f5d95145" containerID="4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" exitCode=0 Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805438 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" event={"ID":"9de314c5-1440-476b-b98b-7804f5d95145","Type":"ContainerDied","Data":"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200"} Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" event={"ID":"9de314c5-1440-476b-b98b-7804f5d95145","Type":"ContainerDied","Data":"704e722c476db821d8e6f00d8c80db7e6888aef51f3367193fd7b4f2cac02bc3"} Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805498 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805907 4795 scope.go:117] "RemoveContainer" containerID="4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.836720 4795 scope.go:117] "RemoveContainer" containerID="4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" Feb 19 21:31:44 crc kubenswrapper[4795]: E0219 21:31:44.837148 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200\": container with ID starting with 4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200 not found: ID does not exist" containerID="4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.837202 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200"} err="failed to get container status \"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200\": rpc error: code = NotFound desc = could not find 
container \"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200\": container with ID starting with 4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200 not found: ID does not exist" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.846301 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.849340 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:31:45 crc kubenswrapper[4795]: I0219 21:31:45.524205 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de314c5-1440-476b-b98b-7804f5d95145" path="/var/lib/kubelet/pods/9de314c5-1440-476b-b98b-7804f5d95145/volumes" Feb 19 21:31:45 crc kubenswrapper[4795]: I0219 21:31:45.525232 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" path="/var/lib/kubelet/pods/ac83daf6-848e-4977-8bb9-a7b4db89618f/volumes" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.528827 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"] Feb 19 21:31:53 crc kubenswrapper[4795]: E0219 21:31:53.530638 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.530709 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" Feb 19 21:31:53 crc kubenswrapper[4795]: E0219 21:31:53.530769 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="registry-server" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.530831 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="registry-server" Feb 19 21:31:53 crc kubenswrapper[4795]: E0219 21:31:53.530897 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="extract-utilities" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.530948 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="extract-utilities" Feb 19 21:31:53 crc kubenswrapper[4795]: E0219 21:31:53.531008 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="extract-content" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.531063 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="extract-content" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.531230 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.531295 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="registry-server" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.531733 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.534012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.534151 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.534324 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.535516 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.535706 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.535823 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.535907 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.536813 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.537814 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.537943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 21:31:53 
crc kubenswrapper[4795]: I0219 21:31:53.538081 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.538237 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.546547 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.548716 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.554152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"] Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.558016 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689767 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689808 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-dir\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " 
pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689852 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689910 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689965 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689983 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-policies\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690019 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd82l\" (UniqueName: \"kubernetes.io/projected/4f54db4e-f039-4a8c-84c6-82b502e3c925-kube-api-access-fd82l\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.791583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.791827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.791935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792332 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-policies\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793350 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd82l\" (UniqueName: \"kubernetes.io/projected/4f54db4e-f039-4a8c-84c6-82b502e3c925-kube-api-access-fd82l\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793594 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-dir\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-dir\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-policies\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792703 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.794583 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.794769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.797510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.797510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.805612 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.805800 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.806641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.810447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.810826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.810859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.818337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd82l\" (UniqueName: \"kubernetes.io/projected/4f54db4e-f039-4a8c-84c6-82b502e3c925-kube-api-access-fd82l\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.856385 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.214388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"]
Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.879058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" event={"ID":"4f54db4e-f039-4a8c-84c6-82b502e3c925","Type":"ContainerStarted","Data":"4c202565e552b9755548ccb2c23f10760de1e618afd946344515df9bc31e4ad5"}
Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.880857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" event={"ID":"4f54db4e-f039-4a8c-84c6-82b502e3c925","Type":"ContainerStarted","Data":"a23a01be45afae3d3855a5117f260f203acfb7581d38469c4afc270a96d7c809"}
Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.880969 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.885239 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"
Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.901019 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" podStartSLOduration=36.900999391 podStartE2EDuration="36.900999391s" podCreationTimestamp="2026-02-19 21:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:31:54.897103654 +0000 UTC m=+226.089621528" watchObservedRunningTime="2026-02-19 21:31:54.900999391 +0000 UTC m=+226.093517255"
Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.428087 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.428495 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.428562 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.429415 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.429532 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e" gracePeriod=600
Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.904020 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e" exitCode=0
Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.904090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e"}
Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.904448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a"}
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.946451 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.947281 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" gracePeriod=15
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.947355 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" gracePeriod=15
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.947383 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" gracePeriod=15
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.947441 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" gracePeriod=15
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.947379 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" gracePeriod=15
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.950776 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951546 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951587 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951612 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951626 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951650 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951663 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951678 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951690 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951705 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951717 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951743 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951755 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951962 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951991 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.952015 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.952040 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.952063 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.952083 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.952303 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.952319 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.958357 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.959905 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.964725 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Feb 19 21:32:03 crc kubenswrapper[4795]: E0219 21:32:03.000103 4795 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102517 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102542 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102723 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102818 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203817 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203858 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.204031 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.204127 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.204162 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.301632 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:03 crc kubenswrapper[4795]: E0219 21:32:03.325591 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895c3438c5182ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 21:32:03.32453553 +0000 UTC m=+234.517053404,LastTimestamp:2026-02-19 21:32:03.32453553 +0000 UTC m=+234.517053404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.931221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6"}
Feb 19 21:32:03 crc 
kubenswrapper[4795]: I0219 21:32:03.931643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2bc8077dbe1f6e57ad6e4ac46bb06b6c63e727e2b8059b59cfdf30ce4e8238ff"} Feb 19 21:32:03 crc kubenswrapper[4795]: E0219 21:32:03.932528 4795 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.933386 4795 generic.go:334] "Generic (PLEG): container finished" podID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" containerID="d1854bd6a757d0896d79bed3a255459850433d4bfa0a2f7e67645e00599c61cf" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.933479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6","Type":"ContainerDied","Data":"d1854bd6a757d0896d79bed3a255459850433d4bfa0a2f7e67645e00599c61cf"} Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.934566 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.935572 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.936690 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937372 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937403 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937416 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937429 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" exitCode=2 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937486 4795 scope.go:117] "RemoveContainer" containerID="d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650" Feb 19 21:32:04 crc kubenswrapper[4795]: I0219 21:32:04.950528 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.295382 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.296722 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.301095 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.301884 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.302293 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.302745 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431239 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 
21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431308 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") pod \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") pod \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431374 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") pod \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" (UID: "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431509 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock" (OuterVolumeSpecName: "var-lock") pod "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" (UID: "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431743 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431998 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.432024 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.432040 4795 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.432054 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.432066 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.437861 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" (UID: "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.519316 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.532756 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.961220 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.961223 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6","Type":"ContainerDied","Data":"6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f"} Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.961264 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.965330 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.966283 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.967076 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" exitCode=0 Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.967129 4795 scope.go:117] "RemoveContainer" containerID="818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.967260 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.968869 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.969320 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.970233 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.970764 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.993918 4795 scope.go:117] "RemoveContainer" containerID="353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.019362 4795 scope.go:117] "RemoveContainer" containerID="21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.037641 4795 scope.go:117] "RemoveContainer" containerID="7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.066059 4795 scope.go:117] "RemoveContainer" containerID="ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.092772 4795 scope.go:117] "RemoveContainer" containerID="6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.130672 4795 scope.go:117] "RemoveContainer" containerID="818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.131416 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\": container with ID starting with 818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30 not found: ID does not exist" containerID="818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.131468 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30"} err="failed to get container status 
\"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\": rpc error: code = NotFound desc = could not find container \"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\": container with ID starting with 818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30 not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.131506 4795 scope.go:117] "RemoveContainer" containerID="353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.132394 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\": container with ID starting with 353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0 not found: ID does not exist" containerID="353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.132429 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0"} err="failed to get container status \"353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\": rpc error: code = NotFound desc = could not find container \"353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\": container with ID starting with 353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0 not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.132449 4795 scope.go:117] "RemoveContainer" containerID="21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.133730 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\": container with ID starting with 21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b not found: ID does not exist" containerID="21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.133763 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b"} err="failed to get container status \"21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\": rpc error: code = NotFound desc = could not find container \"21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\": container with ID starting with 21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.133782 4795 scope.go:117] "RemoveContainer" containerID="7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.134568 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\": container with ID starting with 7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c not found: ID does not exist" containerID="7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.134595 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c"} err="failed to get container status \"7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\": rpc error: code = NotFound desc = could not find container \"7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\": container with ID 
starting with 7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.134613 4795 scope.go:117] "RemoveContainer" containerID="ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.134998 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\": container with ID starting with ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce not found: ID does not exist" containerID="ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.135031 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce"} err="failed to get container status \"ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\": rpc error: code = NotFound desc = could not find container \"ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\": container with ID starting with ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.135056 4795 scope.go:117] "RemoveContainer" containerID="6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.135811 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\": container with ID starting with 6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a not found: ID does not exist" containerID="6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a" Feb 19 
21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.135886 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a"} err="failed to get container status \"6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\": rpc error: code = NotFound desc = could not find container \"6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\": container with ID starting with 6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.990716 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895c3438c5182ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 21:32:03.32453553 +0000 UTC m=+234.517053404,LastTimestamp:2026-02-19 21:32:03.32453553 +0000 UTC m=+234.517053404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.180316 4795 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.181408 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.182018 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.182667 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.183391 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: I0219 21:32:08.183460 4795 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.184029 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="200ms" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.385283 4795 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="400ms" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.785932 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="800ms" Feb 19 21:32:09 crc kubenswrapper[4795]: I0219 21:32:09.517266 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:09 crc kubenswrapper[4795]: E0219 21:32:09.586707 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="1.6s" Feb 19 21:32:11 crc kubenswrapper[4795]: E0219 21:32:11.187622 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="3.2s" Feb 19 21:32:13 crc kubenswrapper[4795]: E0219 21:32:13.555962 4795 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.69:6443: connect: 
connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" volumeName="registry-storage" Feb 19 21:32:14 crc kubenswrapper[4795]: E0219 21:32:14.388627 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="6.4s" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.511726 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.512778 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.528632 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.528709 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:14 crc kubenswrapper[4795]: E0219 21:32:14.529405 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.530394 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:14 crc kubenswrapper[4795]: W0219 21:32:14.551112 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-41655226d47bbaa32fdba77096cc0201e9e2f9d220772b2542c5846761ab512c WatchSource:0}: Error finding container 41655226d47bbaa32fdba77096cc0201e9e2f9d220772b2542c5846761ab512c: Status 404 returned error can't find the container with id 41655226d47bbaa32fdba77096cc0201e9e2f9d220772b2542c5846761ab512c Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030300 4795 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d0854a09a82591d70c3728b2e7a60c1677d7f76e2fa7d0b45cc959f5a6b6fb19" exitCode=0 Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d0854a09a82591d70c3728b2e7a60c1677d7f76e2fa7d0b45cc959f5a6b6fb19"} Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"41655226d47bbaa32fdba77096cc0201e9e2f9d220772b2542c5846761ab512c"} Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030895 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030913 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:15 crc kubenswrapper[4795]: E0219 21:32:15.031364 4795 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.031577 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:16 crc kubenswrapper[4795]: I0219 21:32:16.038707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"448b13b1be0bd809e4ea29aef6a03c34c56c001d6ad87835a10a869a4dbfe46f"} Feb 19 21:32:16 crc kubenswrapper[4795]: I0219 21:32:16.039328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"92f8ac516b57fb2f0845465ac2ad4422fde93efd295aedf8ebf83ebd04aea4a0"} Feb 19 21:32:16 crc kubenswrapper[4795]: I0219 21:32:16.039343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"024c62e342b22d30a7eed51fb61ae33a7b8776fc7cdcb095e7c4f47b2fe5254a"} Feb 19 21:32:16 crc kubenswrapper[4795]: I0219 21:32:16.039355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9c333b5bb4256786b46783a605904ae495f24f99183f7aab2ac51b63d05cbf7f"} Feb 19 21:32:17 crc kubenswrapper[4795]: I0219 21:32:17.050892 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ac2cb93e122d3e40ddbf9f0c954eedc03efaea57abaaf21a1796d82c5ce36949"} Feb 19 21:32:17 crc kubenswrapper[4795]: I0219 21:32:17.051072 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:17 crc kubenswrapper[4795]: I0219 21:32:17.051283 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:17 crc kubenswrapper[4795]: I0219 21:32:17.051313 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.110242 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.111196 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd" exitCode=1 Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.111373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd"} Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.112117 4795 scope.go:117] "RemoveContainer" containerID="8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd" Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.578554 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.125238 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.125793 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dc84cb82080f440e6c764fb2db5e202b4dce225d9e24ad46de4f51d7f0493019"} Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.531862 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.531945 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.540732 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:21 crc kubenswrapper[4795]: I0219 21:32:21.246981 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:21 crc kubenswrapper[4795]: I0219 21:32:21.252986 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:21 crc kubenswrapper[4795]: I0219 21:32:21.848843 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.059100 4795 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.099993 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="794ade11-a364-41bd-85e7-564b6098af69" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.142461 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.142718 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.145311 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="794ade11-a364-41bd-85e7-564b6098af69" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.145751 4795 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://9c333b5bb4256786b46783a605904ae495f24f99183f7aab2ac51b63d05cbf7f" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.145783 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:23 crc kubenswrapper[4795]: I0219 21:32:23.146020 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:23 crc kubenswrapper[4795]: I0219 21:32:23.146397 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:23 crc kubenswrapper[4795]: I0219 21:32:23.148411 4795 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="794ade11-a364-41bd-85e7-564b6098af69" Feb 19 21:32:29 crc kubenswrapper[4795]: I0219 21:32:29.388553 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 21:32:31 crc kubenswrapper[4795]: I0219 21:32:31.854835 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:32 crc kubenswrapper[4795]: I0219 21:32:32.233550 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 21:32:32 crc kubenswrapper[4795]: I0219 21:32:32.588474 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 21:32:33 crc kubenswrapper[4795]: I0219 21:32:33.322739 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 21:32:33 crc kubenswrapper[4795]: I0219 21:32:33.769344 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.150225 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.189785 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.268320 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.358195 4795 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.466968 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.565676 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.640739 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.981344 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.058346 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.069259 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.317786 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.667765 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.688852 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.713948 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.905670 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.068946 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.122677 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.145932 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.283506 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.354905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.386759 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.539140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.738135 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.758492 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.872646 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.894122 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.942916 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.956239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.060279 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.356471 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.473672 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.500313 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.541730 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.567160 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.568859 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 21:32:37 crc 
kubenswrapper[4795]: I0219 21:32:37.661814 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.958120 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.989030 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.014564 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.015383 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.043927 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.082975 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.116459 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.354432 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.366999 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.381419 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.460104 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.515441 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.575159 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.576373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.633556 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.676836 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.726613 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.737071 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.788373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.790685 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 21:32:38 crc 
kubenswrapper[4795]: I0219 21:32:38.817348 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.014818 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.088277 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.104042 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.243729 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.436503 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.511759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.696378 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.711562 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.719001 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.793947 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.805451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.813296 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.890121 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.893284 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.903401 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.024226 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.116507 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.139761 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.161085 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.183411 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.209825 4795 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.263127 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.281161 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.288807 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.325268 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.347421 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.455728 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.543890 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.557750 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.588914 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.734719 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.892283 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.136890 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.150953 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.157369 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.171790 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.205148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.241391 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.376640 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.383620 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.417831 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.455029 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.487882 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.523565 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.529299 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.648424 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.713851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.748810 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.756141 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.802489 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.935533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.972774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.024466 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.148163 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.148275 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.178158 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.287812 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.354046 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.465235 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.581691 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.584793 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.585523 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.656478 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.698987 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.715550 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.716708 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.764155 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.816603 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.881140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.925781 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.946387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.984689 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.995363 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 21:32:43 crc 
kubenswrapper[4795]: I0219 21:32:43.108391 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.111969 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.136472 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.141949 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.246736 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.364794 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.546539 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.556629 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.556685 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.561534 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.574039 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 21:32:43 crc kubenswrapper[4795]: 
I0219 21:32:43.576494 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.576478973 podStartE2EDuration="21.576478973s" podCreationTimestamp="2026-02-19 21:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:32:43.575630949 +0000 UTC m=+274.768148823" watchObservedRunningTime="2026-02-19 21:32:43.576478973 +0000 UTC m=+274.768996837" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.627640 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.643394 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.689053 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.692450 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.761198 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.842318 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.850503 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.933155 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.116060 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.254790 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.270776 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.341241 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.341487 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6" gracePeriod=5 Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.347703 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.410283 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.413265 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.476041 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.488866 4795 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.555729 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.565920 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.622311 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.685815 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.702318 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.729481 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.732265 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.889863 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.920139 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.939451 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.063723 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.080827 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.157759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.176056 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.203609 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.312637 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.347672 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.367741 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.439169 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.452742 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.524692 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.561694 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.593907 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.631360 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.689109 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.724079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.755727 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.756020 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.763431 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.832770 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.880294 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.892149 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.904462 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.165204 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.306905 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.366006 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.459334 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.508913 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.519861 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.538465 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.545652 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.633735 4795 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.721679 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.743918 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.765099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.865998 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.872198 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.947035 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.097197 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.109395 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.131446 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.169869 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 
21:32:47.192763 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.197627 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.198756 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.276792 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.372375 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.623629 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.669799 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.681936 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.719230 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.731844 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.749330 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.786461 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.792333 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.858966 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.962823 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.002727 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.075573 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.117475 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.152919 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.488243 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.555994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.755583 4795 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.851149 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.922309 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.937807 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.023357 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.159123 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.207150 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.230748 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.293298 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.429679 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.437146 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.472943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.728728 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.902434 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.902498 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030344 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030492 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030579 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030692 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030902 4795 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030931 4795 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030950 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.038605 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.104930 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.132494 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.132625 4795 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.304246 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.304295 4795 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6" exitCode=137
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.304342 4795 scope.go:117] "RemoveContainer" containerID="1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6"
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.304423 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.337066 4795 scope.go:117] "RemoveContainer" containerID="1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6"
Feb 19 21:32:50 crc kubenswrapper[4795]: E0219 21:32:50.337557 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6\": container with ID starting with 1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6 not found: ID does not exist" containerID="1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6"
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.337589 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6"} err="failed to get container status \"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6\": rpc error: code = NotFound desc = could not find container \"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6\": container with ID starting with 1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6 not found: ID does not exist"
Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.858308 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 19 21:32:51 crc kubenswrapper[4795]: I0219 21:32:51.414889 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 19 21:32:51 crc kubenswrapper[4795]: I0219 21:32:51.524470 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 19 21:32:51 crc kubenswrapper[4795]: I0219 21:32:51.539951 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 19 21:32:51 crc kubenswrapper[4795]: I0219 21:32:51.923652 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 19 21:32:52 crc kubenswrapper[4795]: I0219 21:32:52.105049 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 19 21:32:52 crc kubenswrapper[4795]: I0219 21:32:52.823399 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.030381 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"]
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.031337 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zzmtm" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="registry-server" containerID="cri-o://e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d" gracePeriod=30
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.052637 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"]
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.053572 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c9sh5" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="registry-server" containerID="cri-o://e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec" gracePeriod=30
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.080951 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"]
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.081221 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" containerID="cri-o://79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0" gracePeriod=30
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.090596 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"]
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.091070 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bmzl7" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="registry-server" containerID="cri-o://124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81" gracePeriod=30
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.096379 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"]
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.096719 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5tclw" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="registry-server" containerID="cri-o://7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f" gracePeriod=30
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102263 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n9qlf"]
Feb 19 21:33:03 crc kubenswrapper[4795]: E0219 21:33:03.102578 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102597 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 19 21:33:03 crc kubenswrapper[4795]: E0219 21:33:03.102609 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" containerName="installer"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102618 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" containerName="installer"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102757 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102792 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" containerName="installer"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.103348 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.115508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n9qlf"]
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.191489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.191586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9lt\" (UniqueName: \"kubernetes.io/projected/c91304a6-fa59-4df4-aa17-d7d2f73d9103-kube-api-access-7t9lt\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.191630 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.292682 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9lt\" (UniqueName: \"kubernetes.io/projected/c91304a6-fa59-4df4-aa17-d7d2f73d9103-kube-api-access-7t9lt\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.293029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.293069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.295291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.298724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.314140 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9lt\" (UniqueName: \"kubernetes.io/projected/c91304a6-fa59-4df4-aa17-d7d2f73d9103-kube-api-access-7t9lt\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.396718 4795 generic.go:334] "Generic (PLEG): container finished" podID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerID="7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f" exitCode=0
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.396756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerDied","Data":"7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f"}
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.398921 4795 generic.go:334] "Generic (PLEG): container finished" podID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerID="e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d" exitCode=0
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.398975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerDied","Data":"e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d"}
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.399000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerDied","Data":"1cc35dfba309639dbd91f218472ea2c5630e482b3631fbe826d36494a013aa5f"}
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.399014 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cc35dfba309639dbd91f218472ea2c5630e482b3631fbe826d36494a013aa5f"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.401108 4795 generic.go:334] "Generic (PLEG): container finished" podID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerID="124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81" exitCode=0
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.401183 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerDied","Data":"124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81"}
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.405367 4795 generic.go:334] "Generic (PLEG): container finished" podID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerID="79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0" exitCode=0
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.405449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" event={"ID":"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca","Type":"ContainerDied","Data":"79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0"}
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.411411 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerID="e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec" exitCode=0
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.411450 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerDied","Data":"e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec"}
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.411500 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerDied","Data":"28fe4330f368c52acd76e8871982506eeb01297bf98866cc2f51b2139ec4aa1c"}
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.411513 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28fe4330f368c52acd76e8871982506eeb01297bf98866cc2f51b2139ec4aa1c"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.491610 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.493972 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzmtm"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.498136 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c9sh5"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.502119 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmzl7"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.508863 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.522366 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tclw"
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596376 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") pod \"7ae7ca82-f2b1-4fec-9f66-732017519586\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") pod \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596532 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") pod \"7ae7ca82-f2b1-4fec-9f66-732017519586\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596626 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") pod \"7ae7ca82-f2b1-4fec-9f66-732017519586\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596656 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") pod \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596798 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") pod \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") pod \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") pod \"e8c7f503-32c4-4ca2-8435-9918cae8d931\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5vg2\" (UniqueName: \"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") pod \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596968 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") pod \"e8c7f503-32c4-4ca2-8435-9918cae8d931\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") pod \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597120 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") pod \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") pod \"e8c7f503-32c4-4ca2-8435-9918cae8d931\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597305 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") pod \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") pod \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") "
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.599273 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" (UID: "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.599882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities" (OuterVolumeSpecName: "utilities") pod "e8c7f503-32c4-4ca2-8435-9918cae8d931" (UID: "e8c7f503-32c4-4ca2-8435-9918cae8d931"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.600052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities" (OuterVolumeSpecName: "utilities") pod "7ae7ca82-f2b1-4fec-9f66-732017519586" (UID: "7ae7ca82-f2b1-4fec-9f66-732017519586"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.601566 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities" (OuterVolumeSpecName: "utilities") pod "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" (UID: "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.603396 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m" (OuterVolumeSpecName: "kube-api-access-lns2m") pod "7ae7ca82-f2b1-4fec-9f66-732017519586" (UID: "7ae7ca82-f2b1-4fec-9f66-732017519586"). InnerVolumeSpecName "kube-api-access-lns2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.603959 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2" (OuterVolumeSpecName: "kube-api-access-m5vg2") pod "12e5472a-2c4b-4b71-91fb-06c3d5fcca54" (UID: "12e5472a-2c4b-4b71-91fb-06c3d5fcca54"). InnerVolumeSpecName "kube-api-access-m5vg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.604548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q" (OuterVolumeSpecName: "kube-api-access-krx4q") pod "e8c7f503-32c4-4ca2-8435-9918cae8d931" (UID: "e8c7f503-32c4-4ca2-8435-9918cae8d931"). InnerVolumeSpecName "kube-api-access-krx4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.604744 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx" (OuterVolumeSpecName: "kube-api-access-5ddlx") pod "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" (UID: "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9"). InnerVolumeSpecName "kube-api-access-5ddlx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.605480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" (UID: "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.606780 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities" (OuterVolumeSpecName: "utilities") pod "12e5472a-2c4b-4b71-91fb-06c3d5fcca54" (UID: "12e5472a-2c4b-4b71-91fb-06c3d5fcca54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.626810 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw" (OuterVolumeSpecName: "kube-api-access-t8bcw") pod "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" (UID: "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca"). InnerVolumeSpecName "kube-api-access-t8bcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.659676 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ae7ca82-f2b1-4fec-9f66-732017519586" (UID: "7ae7ca82-f2b1-4fec-9f66-732017519586"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.659802 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12e5472a-2c4b-4b71-91fb-06c3d5fcca54" (UID: "12e5472a-2c4b-4b71-91fb-06c3d5fcca54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.673001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8c7f503-32c4-4ca2-8435-9918cae8d931" (UID: "e8c7f503-32c4-4ca2-8435-9918cae8d931"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.699951 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.699984 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.699993 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700002 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700012 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700019 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700027 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700034 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700042 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700050 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700058 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5vg2\" (UniqueName: \"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700066 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700074 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700082 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.783226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" (UID: "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.801567 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.906454 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n9qlf"]
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.418281 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmzl7"
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.418281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerDied","Data":"e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748"}
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.418788 4795 scope.go:117] "RemoveContainer" containerID="124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81"
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.420291 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n"
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.420294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" event={"ID":"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca","Type":"ContainerDied","Data":"365d5c2e07de412e6c9e8f0e65078f4ceb7110e13c2cb20266daf040eaf8acbd"}
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.423615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerDied","Data":"d567b35b55d0a2cbb795271be9aef7ece38aac167cf48328a27f71f0b916ce76"}
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.423660 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tclw"
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.426181 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c9sh5"
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.426197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" event={"ID":"c91304a6-fa59-4df4-aa17-d7d2f73d9103","Type":"ContainerStarted","Data":"64e2127208ee2ac235c32326fdf8a8cd7616f06d53e17bfecc0502fae0f15b99"}
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.426239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" event={"ID":"c91304a6-fa59-4df4-aa17-d7d2f73d9103","Type":"ContainerStarted","Data":"63915007f0fc176b803854f4170846dae0c676210a4c7f26666d321c1ca4538c"}
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.426479 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzmtm"
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.443502 4795 scope.go:117] "RemoveContainer" containerID="cb4588aca4785a571a304fb53b83b17a5d4fe720455aa15c94f9117f8cbe5514"
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.446613 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" podStartSLOduration=1.446592772 podStartE2EDuration="1.446592772s" podCreationTimestamp="2026-02-19 21:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:04.445256785 +0000 UTC m=+295.637774649" watchObservedRunningTime="2026-02-19 21:33:04.446592772 +0000 UTC m=+295.639110636"
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.467845 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"]
Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.471341 4795 kubelet.go:2431] "SyncLoop REMOVE"
source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.480090 4795 scope.go:117] "RemoveContainer" containerID="7d3a5224f2284bc0e0b27ce31d0a2ef7f65f8de8a465113fd61e0813421d7fde" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.496417 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.504321 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.511672 4795 scope.go:117] "RemoveContainer" containerID="79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.521037 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.524133 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.530218 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.534756 4795 scope.go:117] "RemoveContainer" containerID="7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.536073 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.543679 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.551722 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.564359 4795 scope.go:117] "RemoveContainer" containerID="1fec75a7b046544f542f990a4f1a07786af212b43341a6cab76f9622d84d42c4" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.584997 4795 scope.go:117] "RemoveContainer" containerID="e32b30648a4cba474dcfdca283954a8496e4f09ea18f9bc9f3563fdbbe7453f6" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.435903 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.439281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.518338 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" path="/var/lib/kubelet/pods/12e5472a-2c4b-4b71-91fb-06c3d5fcca54/volumes" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.518938 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" path="/var/lib/kubelet/pods/7ae7ca82-f2b1-4fec-9f66-732017519586/volumes" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.519519 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" path="/var/lib/kubelet/pods/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca/volumes" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.520345 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" path="/var/lib/kubelet/pods/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9/volumes" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.520868 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" path="/var/lib/kubelet/pods/e8c7f503-32c4-4ca2-8435-9918cae8d931/volumes" Feb 19 21:33:09 crc kubenswrapper[4795]: I0219 21:33:09.299196 4795 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.290215 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.290686 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" containerID="cri-o://71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e" gracePeriod=30 Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.420611 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.421060 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" containerID="cri-o://09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" gracePeriod=30 Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.493634 4795 generic.go:334] "Generic (PLEG): container finished" podID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerID="71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e" exitCode=0 Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.493675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" 
event={"ID":"86ffb50f-47f6-47b2-9141-1de9999a13e0","Type":"ContainerDied","Data":"71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e"} Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.677514 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.747953 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.747993 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.748055 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.748116 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.748146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.749013 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config" (OuterVolumeSpecName: "config") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.749083 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca" (OuterVolumeSpecName: "client-ca") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.749333 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.753682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.754344 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj" (OuterVolumeSpecName: "kube-api-access-4rvcj") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "kube-api-access-4rvcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.772065 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.848737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") pod \"102f7fb5-3031-4853-b112-2aa910aa63a7\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.848805 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") pod \"102f7fb5-3031-4853-b112-2aa910aa63a7\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.848849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") pod \"102f7fb5-3031-4853-b112-2aa910aa63a7\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.848924 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-546h9\" (UniqueName: 
\"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") pod \"102f7fb5-3031-4853-b112-2aa910aa63a7\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "102f7fb5-3031-4853-b112-2aa910aa63a7" (UID: "102f7fb5-3031-4853-b112-2aa910aa63a7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849650 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config" (OuterVolumeSpecName: "config") pod "102f7fb5-3031-4853-b112-2aa910aa63a7" (UID: "102f7fb5-3031-4853-b112-2aa910aa63a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849918 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849943 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849957 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849971 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849982 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849993 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.850004 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.852112 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "102f7fb5-3031-4853-b112-2aa910aa63a7" (UID: "102f7fb5-3031-4853-b112-2aa910aa63a7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.852413 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9" (OuterVolumeSpecName: "kube-api-access-546h9") pod "102f7fb5-3031-4853-b112-2aa910aa63a7" (UID: "102f7fb5-3031-4853-b112-2aa910aa63a7"). InnerVolumeSpecName "kube-api-access-546h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.951489 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.951527 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-546h9\" (UniqueName: \"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499296 4795 generic.go:334] "Generic (PLEG): container finished" podID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerID="09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" exitCode=0 Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499339 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499382 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" event={"ID":"102f7fb5-3031-4853-b112-2aa910aa63a7","Type":"ContainerDied","Data":"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b"} Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499412 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" event={"ID":"102f7fb5-3031-4853-b112-2aa910aa63a7","Type":"ContainerDied","Data":"f4d807af544e927e81a81905631510e6f7454a6d612cc5078b9fdd6b9b356c32"} Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499429 4795 scope.go:117] "RemoveContainer" containerID="09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" Feb 19 21:33:16 crc 
kubenswrapper[4795]: I0219 21:33:16.500745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" event={"ID":"86ffb50f-47f6-47b2-9141-1de9999a13e0","Type":"ContainerDied","Data":"f50c3553621e34238711ac41e2e592ef162e4af963002aedb152ce56da5992e5"} Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.500819 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.513819 4795 scope.go:117] "RemoveContainer" containerID="09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.514435 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b\": container with ID starting with 09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b not found: ID does not exist" containerID="09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.514515 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b"} err="failed to get container status \"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b\": rpc error: code = NotFound desc = could not find container \"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b\": container with ID starting with 09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b not found: ID does not exist" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.514553 4795 scope.go:117] "RemoveContainer" containerID="71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 
21:33:16.534104 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.540251 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.543790 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.550734 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.590676 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591107 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591127 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591138 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591143 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591155 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591164 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591185 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591191 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591202 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591207 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591215 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591221 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591230 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591236 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591245 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 
21:33:16.591250 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591282 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591289 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591297 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591303 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591309 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591316 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591323 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591329 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591337 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: 
I0219 21:33:16.591343 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591351 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591357 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591364 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591370 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591442 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591452 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591460 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591469 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591478 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" Feb 19 21:33:16 crc 
kubenswrapper[4795]: I0219 21:33:16.591484 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591492 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591826 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.595425 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.595567 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.595954 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.596243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.596433 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.598404 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.599209 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.599454 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.601773 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602217 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602230 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602230 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602244 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602245 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.606325 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.609978 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.614670 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.660861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.660906 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.660937 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.660987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661011 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: 
\"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.761731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762597 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762656 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 
21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.763509 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.765231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.768324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.768769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.769099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " 
pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.769404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.769591 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.785058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.786763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.916419 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.935800 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.380828 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.391124 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"] Feb 19 21:33:17 crc kubenswrapper[4795]: W0219 21:33:17.394945 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb64b6f_6a96_4fe9_9fda_67a962a579bb.slice/crio-9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0 WatchSource:0}: Error finding container 9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0: Status 404 returned error can't find the container with id 9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0 Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.508181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" event={"ID":"09f05756-63fb-4c1b-b763-065b4a66ceff","Type":"ContainerStarted","Data":"82e2340862c779da3d50db9a4a4c3d7bdeb32be3ace73f6532ae9bd0d9d449b3"} Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.509448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" event={"ID":"9bb64b6f-6a96-4fe9-9fda-67a962a579bb","Type":"ContainerStarted","Data":"9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0"} Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 
21:33:17.524735 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" path="/var/lib/kubelet/pods/102f7fb5-3031-4853-b112-2aa910aa63a7/volumes" Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.525588 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" path="/var/lib/kubelet/pods/86ffb50f-47f6-47b2-9141-1de9999a13e0/volumes" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.523274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" event={"ID":"9bb64b6f-6a96-4fe9-9fda-67a962a579bb","Type":"ContainerStarted","Data":"7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba"} Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.523740 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.524418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" event={"ID":"09f05756-63fb-4c1b-b763-065b4a66ceff","Type":"ContainerStarted","Data":"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5"} Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.524668 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.534754 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.561884 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" 
podStartSLOduration=3.561870136 podStartE2EDuration="3.561870136s" podCreationTimestamp="2026-02-19 21:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:18.558776701 +0000 UTC m=+309.751294565" watchObservedRunningTime="2026-02-19 21:33:18.561870136 +0000 UTC m=+309.754388000" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.756459 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.776032 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" podStartSLOduration=3.776016157 podStartE2EDuration="3.776016157s" podCreationTimestamp="2026-02-19 21:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:18.578073292 +0000 UTC m=+309.770591156" watchObservedRunningTime="2026-02-19 21:33:18.776016157 +0000 UTC m=+309.968534021" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.104856 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fgxv6"] Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.107376 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.124404 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fgxv6"] Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213177 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c072476d-28b8-4ac2-9faa-2a8f54071b38-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcdds\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-kube-api-access-vcdds\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-certificates\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213501 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-trusted-ca\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213562 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-bound-sa-token\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213619 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-tls\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c072476d-28b8-4ac2-9faa-2a8f54071b38-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.237374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.314929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-trusted-ca\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.314992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-bound-sa-token\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-tls\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c072476d-28b8-4ac2-9faa-2a8f54071b38-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315059 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c072476d-28b8-4ac2-9faa-2a8f54071b38-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcdds\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-kube-api-access-vcdds\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315106 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-certificates\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.316369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-certificates\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.316735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c072476d-28b8-4ac2-9faa-2a8f54071b38-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.317074 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-trusted-ca\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.322057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c072476d-28b8-4ac2-9faa-2a8f54071b38-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.324691 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-tls\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.343046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-bound-sa-token\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.343496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcdds\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-kube-api-access-vcdds\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: 
\"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6"
Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.442942 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6"
Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.826548 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fgxv6"]
Feb 19 21:33:31 crc kubenswrapper[4795]: W0219 21:33:31.832619 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc072476d_28b8_4ac2_9faa_2a8f54071b38.slice/crio-7e91a0576a8105ae06a3f44088f1585b6b67e0ec5d2db89d19e3476e1179878d WatchSource:0}: Error finding container 7e91a0576a8105ae06a3f44088f1585b6b67e0ec5d2db89d19e3476e1179878d: Status 404 returned error can't find the container with id 7e91a0576a8105ae06a3f44088f1585b6b67e0ec5d2db89d19e3476e1179878d
Feb 19 21:33:32 crc kubenswrapper[4795]: I0219 21:33:32.589851 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" event={"ID":"c072476d-28b8-4ac2-9faa-2a8f54071b38","Type":"ContainerStarted","Data":"962c78e81bf040099743ca07798d02b09e7ff69a18e291185ff7292a84b3a4bd"}
Feb 19 21:33:32 crc kubenswrapper[4795]: I0219 21:33:32.589937 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6"
Feb 19 21:33:32 crc kubenswrapper[4795]: I0219 21:33:32.589947 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" event={"ID":"c072476d-28b8-4ac2-9faa-2a8f54071b38","Type":"ContainerStarted","Data":"7e91a0576a8105ae06a3f44088f1585b6b67e0ec5d2db89d19e3476e1179878d"}
Feb 19 21:33:32 crc kubenswrapper[4795]: I0219 21:33:32.611391 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" podStartSLOduration=1.6113737399999999 podStartE2EDuration="1.61137374s" podCreationTimestamp="2026-02-19 21:33:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:32.608694386 +0000 UTC m=+323.801212250" watchObservedRunningTime="2026-02-19 21:33:32.61137374 +0000 UTC m=+323.803891604"
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.287718 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"]
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.288582 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerName="route-controller-manager" containerID="cri-o://7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba" gracePeriod=30
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.606363 4795 generic.go:334] "Generic (PLEG): container finished" podID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerID="7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba" exitCode=0
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.606403 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" event={"ID":"9bb64b6f-6a96-4fe9-9fda-67a962a579bb","Type":"ContainerDied","Data":"7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba"}
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.741219 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.889315 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") pod \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") "
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.889382 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") pod \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") "
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.889468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") pod \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") "
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.889518 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") pod \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") "
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.890118 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "9bb64b6f-6a96-4fe9-9fda-67a962a579bb" (UID: "9bb64b6f-6a96-4fe9-9fda-67a962a579bb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.890145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config" (OuterVolumeSpecName: "config") pod "9bb64b6f-6a96-4fe9-9fda-67a962a579bb" (UID: "9bb64b6f-6a96-4fe9-9fda-67a962a579bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.894448 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9bb64b6f-6a96-4fe9-9fda-67a962a579bb" (UID: "9bb64b6f-6a96-4fe9-9fda-67a962a579bb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.894605 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h" (OuterVolumeSpecName: "kube-api-access-lk96h") pod "9bb64b6f-6a96-4fe9-9fda-67a962a579bb" (UID: "9bb64b6f-6a96-4fe9-9fda-67a962a579bb"). InnerVolumeSpecName "kube-api-access-lk96h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.990992 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.991023 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.991035 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.991042 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.613307 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" event={"ID":"9bb64b6f-6a96-4fe9-9fda-67a962a579bb","Type":"ContainerDied","Data":"9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0"}
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.613369 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.613374 4795 scope.go:117] "RemoveContainer" containerID="7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.639944 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"]
Feb 19 21:33:36 crc kubenswrapper[4795]: E0219 21:33:36.640280 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerName="route-controller-manager"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.640310 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerName="route-controller-manager"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.640463 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerName="route-controller-manager"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.640993 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.643493 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.643614 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.645359 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.645793 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.645922 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.648466 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.650546 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"]
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.657507 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"]
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.662413 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"]
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.801499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n9kz\" (UniqueName: \"kubernetes.io/projected/8e80efae-f1ac-40f9-ad38-61dc2821499e-kube-api-access-6n9kz\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.801548 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-client-ca\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.801619 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-config\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.801643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e80efae-f1ac-40f9-ad38-61dc2821499e-serving-cert\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.902716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-client-ca\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.902835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-config\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.902868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e80efae-f1ac-40f9-ad38-61dc2821499e-serving-cert\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.902907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n9kz\" (UniqueName: \"kubernetes.io/projected/8e80efae-f1ac-40f9-ad38-61dc2821499e-kube-api-access-6n9kz\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.904139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-client-ca\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.904335 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-config\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.907395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e80efae-f1ac-40f9-ad38-61dc2821499e-serving-cert\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.917816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n9kz\" (UniqueName: \"kubernetes.io/projected/8e80efae-f1ac-40f9-ad38-61dc2821499e-kube-api-access-6n9kz\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.956577 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.388655 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"]
Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.520395 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" path="/var/lib/kubelet/pods/9bb64b6f-6a96-4fe9-9fda-67a962a579bb/volumes"
Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.619365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" event={"ID":"8e80efae-f1ac-40f9-ad38-61dc2821499e","Type":"ContainerStarted","Data":"3db54fe9b4bcdd2dfcde25b1c782209dd46ca7eb8c297c01d7a470c0b950e023"}
Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.619486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" event={"ID":"8e80efae-f1ac-40f9-ad38-61dc2821499e","Type":"ContainerStarted","Data":"ff2a96362a19fc9b02cca871a18d60ace8d1f71c1784478eca8f7671f28a2556"}
Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.620812 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.622349 4795 patch_prober.go:28] interesting pod/route-controller-manager-84d5f88f56-vjpj2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body=
Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.622393 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" podUID="8e80efae-f1ac-40f9-ad38-61dc2821499e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused"
Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.638650 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" podStartSLOduration=2.638635696 podStartE2EDuration="2.638635696s" podCreationTimestamp="2026-02-19 21:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:37.63542012 +0000 UTC m=+328.827937984" watchObservedRunningTime="2026-02-19 21:33:37.638635696 +0000 UTC m=+328.831153560"
Feb 19 21:33:38 crc kubenswrapper[4795]: I0219 21:33:38.630075 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"
Feb 19 21:33:51 crc kubenswrapper[4795]: I0219 21:33:51.452537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6"
Feb 19 21:33:51 crc kubenswrapper[4795]: I0219 21:33:51.511012 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"]
Feb 19 21:33:58 crc kubenswrapper[4795]: I0219 21:33:58.427910 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:33:58 crc kubenswrapper[4795]: I0219 21:33:58.428553 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.147566 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mdtnr"]
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.149479 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.151664 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.168067 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdtnr"]
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.276431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw6pd\" (UniqueName: \"kubernetes.io/projected/f457fe15-4099-4d77-8140-3297bee0a182-kube-api-access-vw6pd\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.276847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-catalog-content\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.277140 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-utilities\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.343332 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p266t"]
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.344275 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.346636 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.359237 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p266t"]
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.395554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-catalog-content\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.395665 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-utilities\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.395725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw6pd\" (UniqueName: \"kubernetes.io/projected/f457fe15-4099-4d77-8140-3297bee0a182-kube-api-access-vw6pd\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.396125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-utilities\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.396371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-catalog-content\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.425131 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw6pd\" (UniqueName: \"kubernetes.io/projected/f457fe15-4099-4d77-8140-3297bee0a182-kube-api-access-vw6pd\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.469989 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdtnr"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.496788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqmwl\" (UniqueName: \"kubernetes.io/projected/432a371d-d143-4da7-9332-682f52b39381-kube-api-access-mqmwl\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.496943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-utilities\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.497001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-catalog-content\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.598200 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqmwl\" (UniqueName: \"kubernetes.io/projected/432a371d-d143-4da7-9332-682f52b39381-kube-api-access-mqmwl\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.598639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-utilities\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.598672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-catalog-content\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.599341 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-catalog-content\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.599350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-utilities\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.619920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqmwl\" (UniqueName: \"kubernetes.io/projected/432a371d-d143-4da7-9332-682f52b39381-kube-api-access-mqmwl\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.704521 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p266t"
Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.876118 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdtnr"]
Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.060755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p266t"]
Feb 19 21:34:14 crc kubenswrapper[4795]: W0219 21:34:14.096558 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432a371d_d143_4da7_9332_682f52b39381.slice/crio-5cb7b1d8a97bf4b9862704ff25f694d6972eb35858aac2d0da3339e985d59a7e WatchSource:0}: Error finding container 5cb7b1d8a97bf4b9862704ff25f694d6972eb35858aac2d0da3339e985d59a7e: Status 404 returned error can't find the container with id 5cb7b1d8a97bf4b9862704ff25f694d6972eb35858aac2d0da3339e985d59a7e
Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.818719 4795 generic.go:334] "Generic (PLEG): container finished" podID="f457fe15-4099-4d77-8140-3297bee0a182" containerID="9ef1064bce8bd4a67adcc876213ec91f6b1c19e931d783289fdcbf08c748684f" exitCode=0
Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.818821 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerDied","Data":"9ef1064bce8bd4a67adcc876213ec91f6b1c19e931d783289fdcbf08c748684f"}
Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.818908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerStarted","Data":"4856b161db877886e34629abe64d1bcb0728155b7b8faf257539528a50573274"}
Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.821901 4795 generic.go:334] "Generic (PLEG): container finished" podID="432a371d-d143-4da7-9332-682f52b39381" containerID="daee02a63e745db155d7de8aac8f4ca682b9ef8ec8d76969ae8d79cf5644783d" exitCode=0
Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.821934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerDied","Data":"daee02a63e745db155d7de8aac8f4ca682b9ef8ec8d76969ae8d79cf5644783d"}
Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.821967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerStarted","Data":"5cb7b1d8a97bf4b9862704ff25f694d6972eb35858aac2d0da3339e985d59a7e"}
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.291424 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"]
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.291858 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerName="controller-manager" containerID="cri-o://60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" gracePeriod=30
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.341869 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r8t7x"]
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.343051 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.344812 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.356274 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r8t7x"]
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.430470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9xpk\" (UniqueName: \"kubernetes.io/projected/4941d783-94cd-4a5c-a124-5c8751cc8494-kube-api-access-k9xpk\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.430605 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-catalog-content\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.430798 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-utilities\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.531878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-utilities\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.532320 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9xpk\" (UniqueName: \"kubernetes.io/projected/4941d783-94cd-4a5c-a124-5c8751cc8494-kube-api-access-k9xpk\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.532358 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-catalog-content\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.532382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-utilities\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.532787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-catalog-content\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.557299 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9xpk\" (UniqueName: \"kubernetes.io/projected/4941d783-94cd-4a5c-a124-5c8751cc8494-kube-api-access-k9xpk\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.653129 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv"
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.734847 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") pod \"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") "
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.734919 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") pod \"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") "
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.734965 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") pod \"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") "
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.734986 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") pod \"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") "
Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.735022 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") pod
\"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.735809 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca" (OuterVolumeSpecName: "client-ca") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.735928 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config" (OuterVolumeSpecName: "config") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.735857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.739455 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.739922 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.739921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm" (OuterVolumeSpecName: "kube-api-access-7ngvm") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "kube-api-access-7ngvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.832731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerStarted","Data":"b9423a320aab48ad3d9b982af503bf9d93dbf652a3d81eb4d4c08e2b544a60d3"} Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835807 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835831 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835841 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835849 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835858 4795 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836723 4795 generic.go:334] "Generic (PLEG): container finished" podID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerID="60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" exitCode=0 Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836801 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" event={"ID":"09f05756-63fb-4c1b-b763-065b4a66ceff","Type":"ContainerDied","Data":"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5"} Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" event={"ID":"09f05756-63fb-4c1b-b763-065b4a66ceff","Type":"ContainerDied","Data":"82e2340862c779da3d50db9a4a4c3d7bdeb32be3ace73f6532ae9bd0d9d449b3"} Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836853 4795 scope.go:117] "RemoveContainer" containerID="60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836928 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.843116 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerStarted","Data":"b2320d7447d9ae8b69ae359baad4c9c3633df43693ee76d985a2bbd216cfd589"} Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.873815 4795 scope.go:117] "RemoveContainer" containerID="60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" Feb 19 21:34:15 crc kubenswrapper[4795]: E0219 21:34:15.882870 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5\": container with ID starting with 60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5 not found: ID does not exist" containerID="60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.882956 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5"} err="failed to get container status \"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5\": rpc error: code = NotFound desc = could not find container \"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5\": container with ID starting with 60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5 not found: ID does not exist" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.885603 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.888395 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.908429 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r8t7x"] Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.946491 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4v92x"] Feb 19 21:34:15 crc kubenswrapper[4795]: E0219 21:34:15.946801 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerName="controller-manager" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.946824 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerName="controller-manager" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.947117 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerName="controller-manager" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.954990 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.961256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.961481 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4v92x"] Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.039643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfbhm\" (UniqueName: \"kubernetes.io/projected/4002b94b-8679-454c-a721-fa900f6cde3b-kube-api-access-vfbhm\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.039685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-catalog-content\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.039717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-utilities\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.140875 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfbhm\" (UniqueName: \"kubernetes.io/projected/4002b94b-8679-454c-a721-fa900f6cde3b-kube-api-access-vfbhm\") pod \"redhat-operators-4v92x\" (UID: 
\"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.140918 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-catalog-content\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.140954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-utilities\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.141389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-utilities\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.141460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-catalog-content\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.157436 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfbhm\" (UniqueName: \"kubernetes.io/projected/4002b94b-8679-454c-a721-fa900f6cde3b-kube-api-access-vfbhm\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " 
pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.307031 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.519936 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4v92x"] Feb 19 21:34:16 crc kubenswrapper[4795]: W0219 21:34:16.530358 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4002b94b_8679_454c_a721_fa900f6cde3b.slice/crio-50fb6def2952fa87e8b69160dd8658191952e0924a74c50c7f54839084847fe3 WatchSource:0}: Error finding container 50fb6def2952fa87e8b69160dd8658191952e0924a74c50c7f54839084847fe3: Status 404 returned error can't find the container with id 50fb6def2952fa87e8b69160dd8658191952e0924a74c50c7f54839084847fe3 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.559373 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" podUID="80407681-6091-46cc-836f-757ec4d16604" containerName="registry" containerID="cri-o://65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9" gracePeriod=30 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.663516 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75dccc5c74-kxcfb"] Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.664096 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667111 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667256 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667392 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667517 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667575 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667701 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.673055 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.680359 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75dccc5c74-kxcfb"] Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-client-ca\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " 
pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-proxy-ca-bundles\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757710 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhsl\" (UniqueName: \"kubernetes.io/projected/f1eb72f6-8164-4492-886e-8a24ec7b56c3-kube-api-access-tmhsl\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757768 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-config\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1eb72f6-8164-4492-886e-8a24ec7b56c3-serving-cert\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.853003 4795 generic.go:334] "Generic (PLEG): container finished" podID="432a371d-d143-4da7-9332-682f52b39381" 
containerID="b9423a320aab48ad3d9b982af503bf9d93dbf652a3d81eb4d4c08e2b544a60d3" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.853069 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerDied","Data":"b9423a320aab48ad3d9b982af503bf9d93dbf652a3d81eb4d4c08e2b544a60d3"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.855721 4795 generic.go:334] "Generic (PLEG): container finished" podID="4002b94b-8679-454c-a721-fa900f6cde3b" containerID="ad79ddcd190a1eef44de3c7cc5a165efaf090d3940c4dfdd69cab1403f6866c2" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.856145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerDied","Data":"ad79ddcd190a1eef44de3c7cc5a165efaf090d3940c4dfdd69cab1403f6866c2"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.856199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerStarted","Data":"50fb6def2952fa87e8b69160dd8658191952e0924a74c50c7f54839084847fe3"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhsl\" (UniqueName: \"kubernetes.io/projected/f1eb72f6-8164-4492-886e-8a24ec7b56c3-kube-api-access-tmhsl\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-config\") pod 
\"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858762 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1eb72f6-8164-4492-886e-8a24ec7b56c3-serving-cert\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858798 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-client-ca\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-proxy-ca-bundles\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.860469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-client-ca\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.860728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-proxy-ca-bundles\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.861817 4795 generic.go:334] "Generic (PLEG): container finished" podID="4941d783-94cd-4a5c-a124-5c8751cc8494" containerID="2a13a0790dc3bb7cf68dab93c75f7b9ee68b6d88b5baf91479232ddf7a7e83a0" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.861881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r8t7x" event={"ID":"4941d783-94cd-4a5c-a124-5c8751cc8494","Type":"ContainerDied","Data":"2a13a0790dc3bb7cf68dab93c75f7b9ee68b6d88b5baf91479232ddf7a7e83a0"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.861905 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r8t7x" event={"ID":"4941d783-94cd-4a5c-a124-5c8751cc8494","Type":"ContainerStarted","Data":"00bcdf3193d28659cbe747a71cf7bf86337c90e5f777d20f680713f4490a05c7"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.863109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-config\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.874369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1eb72f6-8164-4492-886e-8a24ec7b56c3-serving-cert\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 
21:34:16.892390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhsl\" (UniqueName: \"kubernetes.io/projected/f1eb72f6-8164-4492-886e-8a24ec7b56c3-kube-api-access-tmhsl\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.894525 4795 generic.go:334] "Generic (PLEG): container finished" podID="80407681-6091-46cc-836f-757ec4d16604" containerID="65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.894585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" event={"ID":"80407681-6091-46cc-836f-757ec4d16604","Type":"ContainerDied","Data":"65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.907895 4795 generic.go:334] "Generic (PLEG): container finished" podID="f457fe15-4099-4d77-8140-3297bee0a182" containerID="b2320d7447d9ae8b69ae359baad4c9c3633df43693ee76d985a2bbd216cfd589" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.907938 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerDied","Data":"b2320d7447d9ae8b69ae359baad4c9c3633df43693ee76d985a2bbd216cfd589"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.907965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerStarted","Data":"f938be8ab635ea8a8d765d8b19d4b4e613931d75d16f156e55c63f345b513f51"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.947606 4795 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-mdtnr" podStartSLOduration=2.553936162 podStartE2EDuration="3.947591267s" podCreationTimestamp="2026-02-19 21:34:13 +0000 UTC" firstStartedPulling="2026-02-19 21:34:14.82135056 +0000 UTC m=+366.013868424" lastFinishedPulling="2026-02-19 21:34:16.215005665 +0000 UTC m=+367.407523529" observedRunningTime="2026-02-19 21:34:16.944706559 +0000 UTC m=+368.137224423" watchObservedRunningTime="2026-02-19 21:34:16.947591267 +0000 UTC m=+368.140109131" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.949143 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959484 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959542 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959638 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959661 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959685 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.960949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.961409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.963307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs" (OuterVolumeSpecName: "kube-api-access-b4kqs") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "kube-api-access-b4kqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.963629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.964363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.964666 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.977978 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.978431 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.023810 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061637 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061686 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061698 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061707 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061716 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061724 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061732 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.217810 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75dccc5c74-kxcfb"] Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.520828 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" path="/var/lib/kubelet/pods/09f05756-63fb-4c1b-b763-065b4a66ceff/volumes" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.918228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerStarted","Data":"b211347bda79d6334829206bacffd181e7103ae2551df554c80058c66927561a"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.919674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerStarted","Data":"c68f15a983051f4f99ec5ef6d4ff27d4f6d9bb5925061af129100d919e54372c"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.921861 4795 generic.go:334] "Generic (PLEG): container finished" podID="4941d783-94cd-4a5c-a124-5c8751cc8494" containerID="527326d97e5ae95c589541ded8c2267bf0cc084a5d6d6e6c0e689826ef089c37" exitCode=0 Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.921913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r8t7x" event={"ID":"4941d783-94cd-4a5c-a124-5c8751cc8494","Type":"ContainerDied","Data":"527326d97e5ae95c589541ded8c2267bf0cc084a5d6d6e6c0e689826ef089c37"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.927407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" 
event={"ID":"f1eb72f6-8164-4492-886e-8a24ec7b56c3","Type":"ContainerStarted","Data":"29d76f92b63ad5874a47fa0c6423d915bc59af679709ea0cc500dc141e825163"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.927440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" event={"ID":"f1eb72f6-8164-4492-886e-8a24ec7b56c3","Type":"ContainerStarted","Data":"a3658b2c4a0a12c86cabc8ed73e4998562ac8c4f155e33a413875640448d2b73"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.927453 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.932661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" event={"ID":"80407681-6091-46cc-836f-757ec4d16604","Type":"ContainerDied","Data":"f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.932713 4795 scope.go:117] "RemoveContainer" containerID="65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.932825 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.933129 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.940109 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p266t" podStartSLOduration=2.497806347 podStartE2EDuration="4.940091867s" podCreationTimestamp="2026-02-19 21:34:13 +0000 UTC" firstStartedPulling="2026-02-19 21:34:14.82432737 +0000 UTC m=+366.016845234" lastFinishedPulling="2026-02-19 21:34:17.26661289 +0000 UTC m=+368.459130754" observedRunningTime="2026-02-19 21:34:17.936515321 +0000 UTC m=+369.129033185" watchObservedRunningTime="2026-02-19 21:34:17.940091867 +0000 UTC m=+369.132609731" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.995891 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" podStartSLOduration=2.995878373 podStartE2EDuration="2.995878373s" podCreationTimestamp="2026-02-19 21:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:34:17.995117153 +0000 UTC m=+369.187635017" watchObservedRunningTime="2026-02-19 21:34:17.995878373 +0000 UTC m=+369.188396237" Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.044876 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.051417 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.938479 4795 generic.go:334] "Generic (PLEG): container 
finished" podID="4002b94b-8679-454c-a721-fa900f6cde3b" containerID="c68f15a983051f4f99ec5ef6d4ff27d4f6d9bb5925061af129100d919e54372c" exitCode=0 Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.938555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerDied","Data":"c68f15a983051f4f99ec5ef6d4ff27d4f6d9bb5925061af129100d919e54372c"} Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.940354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r8t7x" event={"ID":"4941d783-94cd-4a5c-a124-5c8751cc8494","Type":"ContainerStarted","Data":"cd0f881418c39c61d538a924534edeccfb4228f8b26bb2f170aeb97e73127009"} Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.975754 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r8t7x" podStartSLOduration=2.536868626 podStartE2EDuration="3.975733785s" podCreationTimestamp="2026-02-19 21:34:15 +0000 UTC" firstStartedPulling="2026-02-19 21:34:16.864248612 +0000 UTC m=+368.056766476" lastFinishedPulling="2026-02-19 21:34:18.303113771 +0000 UTC m=+369.495631635" observedRunningTime="2026-02-19 21:34:18.974554623 +0000 UTC m=+370.167072497" watchObservedRunningTime="2026-02-19 21:34:18.975733785 +0000 UTC m=+370.168251649" Feb 19 21:34:19 crc kubenswrapper[4795]: I0219 21:34:19.523132 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80407681-6091-46cc-836f-757ec4d16604" path="/var/lib/kubelet/pods/80407681-6091-46cc-836f-757ec4d16604/volumes" Feb 19 21:34:19 crc kubenswrapper[4795]: I0219 21:34:19.948681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerStarted","Data":"cd522c11f1f693280a76f8e450998fe73c235352edcb3197b989bee12ba9092b"} Feb 19 
21:34:19 crc kubenswrapper[4795]: I0219 21:34:19.965617 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4v92x" podStartSLOduration=2.499528385 podStartE2EDuration="4.965592153s" podCreationTimestamp="2026-02-19 21:34:15 +0000 UTC" firstStartedPulling="2026-02-19 21:34:16.856746621 +0000 UTC m=+368.049264485" lastFinishedPulling="2026-02-19 21:34:19.322810349 +0000 UTC m=+370.515328253" observedRunningTime="2026-02-19 21:34:19.96398849 +0000 UTC m=+371.156506354" watchObservedRunningTime="2026-02-19 21:34:19.965592153 +0000 UTC m=+371.158110057" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.470580 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.471105 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.509397 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.707299 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.707374 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.743261 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:24 crc kubenswrapper[4795]: I0219 21:34:24.007984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:24 crc kubenswrapper[4795]: I0219 
21:34:24.009944 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:25 crc kubenswrapper[4795]: I0219 21:34:25.740216 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:25 crc kubenswrapper[4795]: I0219 21:34:25.740547 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:25 crc kubenswrapper[4795]: I0219 21:34:25.801535 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:26 crc kubenswrapper[4795]: I0219 21:34:26.013159 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:26 crc kubenswrapper[4795]: I0219 21:34:26.308321 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:26 crc kubenswrapper[4795]: I0219 21:34:26.308420 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:26 crc kubenswrapper[4795]: I0219 21:34:26.345544 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:27 crc kubenswrapper[4795]: I0219 21:34:27.047614 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:28 crc kubenswrapper[4795]: I0219 21:34:28.427608 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 19 21:34:28 crc kubenswrapper[4795]: I0219 21:34:28.427687 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.427568 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.428241 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.428294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.429704 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.429781 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a" gracePeriod=600 Feb 19 21:34:59 crc kubenswrapper[4795]: I0219 21:34:59.177057 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a" exitCode=0 Feb 19 21:34:59 crc kubenswrapper[4795]: I0219 21:34:59.177128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a"} Feb 19 21:34:59 crc kubenswrapper[4795]: I0219 21:34:59.177689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a"} Feb 19 21:34:59 crc kubenswrapper[4795]: I0219 21:34:59.177714 4795 scope.go:117] "RemoveContainer" containerID="6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e" Feb 19 21:36:58 crc kubenswrapper[4795]: I0219 21:36:58.427990 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:36:58 crc kubenswrapper[4795]: I0219 21:36:58.429618 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:37:09 crc kubenswrapper[4795]: I0219 21:37:09.759274 4795 scope.go:117] "RemoveContainer" containerID="4275386b7725c8ec761b57381bafb0a0755373bf551591680c5fa169d19954f2" Feb 19 21:37:09 crc kubenswrapper[4795]: I0219 21:37:09.781032 4795 scope.go:117] "RemoveContainer" containerID="2bc16809b25a9de1b1eb527dd20e661703d80bd6e75ac128fd97a218812a8320" Feb 19 21:37:09 crc kubenswrapper[4795]: I0219 21:37:09.805694 4795 scope.go:117] "RemoveContainer" containerID="843a490c0d5b242211efc7cdd05db863b65b298c17a2a1bc1bcd8caa9584d3d3" Feb 19 21:37:09 crc kubenswrapper[4795]: I0219 21:37:09.828830 4795 scope.go:117] "RemoveContainer" containerID="099f13240cf3aea087d163d3e1052bf3c9c7278f2cd6182ccc06266ec67326ae" Feb 19 21:37:28 crc kubenswrapper[4795]: I0219 21:37:28.428346 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:37:28 crc kubenswrapper[4795]: I0219 21:37:28.429118 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 21:37:58.427588 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 
21:37:58.428303 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 21:37:58.428375 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 21:37:58.429596 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 21:37:58.429698 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a" gracePeriod=600 Feb 19 21:37:59 crc kubenswrapper[4795]: I0219 21:37:59.345865 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a" exitCode=0 Feb 19 21:37:59 crc kubenswrapper[4795]: I0219 21:37:59.345953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a"} Feb 19 21:37:59 crc 
kubenswrapper[4795]: I0219 21:37:59.346478 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca"} Feb 19 21:37:59 crc kubenswrapper[4795]: I0219 21:37:59.346523 4795 scope.go:117] "RemoveContainer" containerID="b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a" Feb 19 21:38:09 crc kubenswrapper[4795]: I0219 21:38:09.860664 4795 scope.go:117] "RemoveContainer" containerID="e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec" Feb 19 21:38:09 crc kubenswrapper[4795]: I0219 21:38:09.879940 4795 scope.go:117] "RemoveContainer" containerID="e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d" Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.839725 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4qphl"] Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840741 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-controller" containerID="cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840785 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="nbdb" containerID="cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840866 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" 
containerName="northd" containerID="cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840916 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840935 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="sbdb" containerID="cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840961 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-node" containerID="cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.841006 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-acl-logging" containerID="cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.876558 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" containerID="cri-o://3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" gracePeriod=30 Feb 19 21:39:16 crc 
kubenswrapper[4795]: I0219 21:39:16.121255 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.123440 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovn-acl-logging/0.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.123921 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovn-controller/0.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.124536 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173046 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pf8fb"] Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173309 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173327 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173344 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173352 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173365 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="nbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173373 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="nbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173386 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kubecfg-setup" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173393 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kubecfg-setup" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173401 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173408 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173417 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-node" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-node" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173433 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173441 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173450 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173458 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173469 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80407681-6091-46cc-836f-757ec4d16604" containerName="registry" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173475 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="80407681-6091-46cc-836f-757ec4d16604" containerName="registry" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173485 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173491 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173501 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="sbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173508 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="sbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173518 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173527 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173537 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-acl-logging" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173545 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-acl-logging" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173555 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="northd" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173562 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="northd" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173664 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173675 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="80407681-6091-46cc-836f-757ec4d16604" containerName="registry" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173685 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173692 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173703 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="northd" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173713 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173724 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-node" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173732 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="nbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173741 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-acl-logging" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173752 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="sbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173762 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173950 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173960 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.175735 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239295 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239670 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239708 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239404 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239774 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash" (OuterVolumeSpecName: "host-slash") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239811 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239822 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239825 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239769 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240000 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240033 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240062 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240089 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240118 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240147 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). 
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240186 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240280 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240280 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket" (OuterVolumeSpecName: "log-socket") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240365 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log" (OuterVolumeSpecName: "node-log") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-netns\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-var-lib-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240458 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240531 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-bin\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240564 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-kubelet\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240593 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24d72341-090a-4e01-bc3f-9e04becb3500-ovn-node-metrics-cert\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240617 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-node-log\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240659 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-script-lib\") pod \"ovnkube-node-pf8fb\" (UID: 
\"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240786 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240778 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-netd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-slash\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-env-overrides\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod 
"adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240930 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv45v\" (UniqueName: \"kubernetes.io/projected/24d72341-090a-4e01-bc3f-9e04becb3500-kube-api-access-zv45v\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-log-socket\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241014 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-config\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241096 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-systemd-units\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241206 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-systemd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-ovn\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241372 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-etc-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241647 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241678 4795 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241697 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241714 4795 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241731 4795 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241748 4795 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241766 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241786 4795 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241803 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241820 4795 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241836 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241852 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 
19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241869 4795 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241887 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241904 4795 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241920 4795 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241937 4795 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.245704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb" (OuterVolumeSpecName: "kube-api-access-6nrjb") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "kube-api-access-6nrjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.246284 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.255675 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343560 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-ovn\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343684 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-ovn\") pod 
\"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-etc-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343685 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-etc-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343756 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-netns\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-var-lib-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: 
\"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-bin\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-kubelet\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24d72341-090a-4e01-bc3f-9e04becb3500-ovn-node-metrics-cert\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-node-log\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-script-lib\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-netd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-netns\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-slash\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343998 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-node-log\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344034 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-env-overrides\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344080 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv45v\" (UniqueName: \"kubernetes.io/projected/24d72341-090a-4e01-bc3f-9e04becb3500-kube-api-access-zv45v\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-config\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-log-socket\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344313 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-systemd-units\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-systemd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344396 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344416 4795 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344438 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-systemd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-bin\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-kubelet\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-script-lib\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343917 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-var-lib-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-netd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-slash\") pod 
\"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345194 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-config\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-log-socket\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345449 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-systemd-units\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-env-overrides\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 
21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.348555 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24d72341-090a-4e01-bc3f-9e04becb3500-ovn-node-metrics-cert\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.362275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv45v\" (UniqueName: \"kubernetes.io/projected/24d72341-090a-4e01-bc3f-9e04becb3500-kube-api-access-zv45v\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.490832 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.792779 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/2.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.793549 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/1.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.793638 4795 generic.go:334] "Generic (PLEG): container finished" podID="e967392b-9bd8-4111-b1b9-96d503a19668" containerID="2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9" exitCode=2 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.793725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerDied","Data":"2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.793806 4795 scope.go:117] "RemoveContainer" containerID="02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.794328 4795 scope.go:117] "RemoveContainer" containerID="2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.794592 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5p6d9_openshift-multus(e967392b-9bd8-4111-b1b9-96d503a19668)\"" pod="openshift-multus/multus-5p6d9" podUID="e967392b-9bd8-4111-b1b9-96d503a19668" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.799511 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.801564 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovn-acl-logging/0.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.801972 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovn-controller/0.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802520 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802602 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802658 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802706 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802774 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802827 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802882 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" exitCode=143 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802930 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" exitCode=143 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802826 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803237 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803277 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803295 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803304 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803314 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803324 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803334 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803344 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803354 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803364 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803373 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803388 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803404 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803416 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803425 4795 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803435 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803445 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803453 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803462 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803471 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803480 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803488 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803503 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803520 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803532 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803541 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803548 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803555 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803565 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803571 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 
21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803578 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803585 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803591 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803614 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803623 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803631 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803638 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803645 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803652 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803659 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803666 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803673 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803680 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802745 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.805277 4795 generic.go:334] "Generic (PLEG): container finished" podID="24d72341-090a-4e01-bc3f-9e04becb3500" containerID="193aac3fcec4da276a7aec22b1427ad4f66933bf38a55fc8aeeb45f9b5a11fd4" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.805369 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerDied","Data":"193aac3fcec4da276a7aec22b1427ad4f66933bf38a55fc8aeeb45f9b5a11fd4"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.805442 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"6c92282b9b9b80b829143d77ba70c574b5e19c9e3fe2b59c01addfa468227c9d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.835181 4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.875177 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4qphl"] Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.879263 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.881674 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4qphl"] Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.917810 4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.930131 4795 scope.go:117] "RemoveContainer" 
containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.943233 4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.958404 4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.988400 4795 scope.go:117] "RemoveContainer" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.012245 4795 scope.go:117] "RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.038804 4795 scope.go:117] "RemoveContainer" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.054381 4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.080714 4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.081104 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not exist" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.081136 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} err="failed to get container status \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": rpc error: code = NotFound desc = could not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.081159 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.081682 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.081702 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} err="failed to get container status \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.081714 4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.081997 4795 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not found: ID does not exist" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082054 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} err="failed to get container status \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082069 4795 scope.go:117] "RemoveContainer" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.082410 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082445 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} err="failed to get container status \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": rpc error: code = NotFound desc = could not find container 
\"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082463 4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.082760 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082781 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} err="failed to get container status \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082793 4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.083179 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist" 
containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083208 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} err="failed to get container status \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083228 4795 scope.go:117] "RemoveContainer" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.083472 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083494 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} err="failed to get container status \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083507 4795 scope.go:117] 
"RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.083813 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083833 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} err="failed to get container status \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083845 4795 scope.go:117] "RemoveContainer" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.084052 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084071 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} err="failed to get container status \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": rpc error: code = NotFound desc = could not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084083 4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.084351 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084379 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} err="failed to get container status \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084397 4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084826 4795 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} err="failed to get container status \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": rpc error: code = NotFound desc = could not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084843 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.085048 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} err="failed to get container status \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.085064 4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.085746 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} err="failed to get container status \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not 
found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.085770 4795 scope.go:117] "RemoveContainer" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086059 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} err="failed to get container status \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": rpc error: code = NotFound desc = could not find container \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086086 4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086476 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} err="failed to get container status \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086519 4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086768 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} err="failed to get 
container status \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086811 4795 scope.go:117] "RemoveContainer" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087349 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} err="failed to get container status \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087372 4795 scope.go:117] "RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087710 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} err="failed to get container status \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087736 4795 scope.go:117] "RemoveContainer" 
containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087964 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} err="failed to get container status \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": rpc error: code = NotFound desc = could not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088010 4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088332 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} err="failed to get container status \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088374 4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088746 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} err="failed to get container status \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": rpc error: code = NotFound desc = could 
not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088794 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089072 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} err="failed to get container status \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089093 4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089310 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} err="failed to get container status \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089353 4795 scope.go:117] "RemoveContainer" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 
21:39:17.089597 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} err="failed to get container status \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": rpc error: code = NotFound desc = could not find container \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089624 4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089842 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} err="failed to get container status \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089868 4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090070 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} err="failed to get container status \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 
05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090097 4795 scope.go:117] "RemoveContainer" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090349 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} err="failed to get container status \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090368 4795 scope.go:117] "RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090569 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} err="failed to get container status \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090586 4795 scope.go:117] "RemoveContainer" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090774 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} err="failed to get container status \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": rpc error: code = NotFound desc = could not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090791    4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090994    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} err="failed to get container status \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.091012    4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.091499    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} err="failed to get container status \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": rpc error: code = NotFound desc = could not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.091584    4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.091978    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} err="failed to get container status \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092002    4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092307    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} err="failed to get container status \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092332    4795 scope.go:117] "RemoveContainer" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092609    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} err="failed to get container status \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": rpc error: code = NotFound desc = could not find container \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092628    4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092980    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} err="failed to get container status \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093002    4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093251    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} err="failed to get container status \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093275    4795 scope.go:117] "RemoveContainer" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093472    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} err="failed to get container status \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093490    4795 scope.go:117] "RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093686    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} err="failed to get container status \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093705    4795 scope.go:117] "RemoveContainer" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093900    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} err="failed to get container status \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": rpc error: code = NotFound desc = could not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093923    4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.094201    4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} err="failed to get container status \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.518455    4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" path="/var/lib/kubelet/pods/adf5bd36-b46b-4a06-8291-cae9f3988330/volumes"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.814321    4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/2.log"
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821487    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"0c08565dc9b29a05ad8bf00031d18acac856a1b316d912f8504345e535fd4b40"}
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821537    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"647b3f01d647f3ad581c625aa58960eb56e018d582fa5085d823401c1a126110"}
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821560    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"6fc57746e9dcda6b57e64bceab831932e5c5f45fc6254378e692ece47caec394"}
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821578    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"6a46a7718bb4b86f2153d5d76e61c96d00420c2e80be4c85737decd2345ccf16"}
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821594    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"4ebfdca56cc19c6ad360bb7600c9b108012f9686b333722fa40779e9317f58d3"}
Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821611    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"12622a6e9a1cb4219ad74d3abc47d2af2d83ea744ded01c72cf1ba7a281221be"}
Feb 19 21:39:19 crc kubenswrapper[4795]: I0219 21:39:19.849102    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"ffb96d524a15ba82de35e22ce6eae23556b8100e8a34fea8ecf5781711675253"}
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.246024    4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-6tsln"]
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.247202    4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.250277    4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.250383    4795 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4rmf9"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.252996    4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.253636    4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.304592    4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.304737    4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67zfj\" (UniqueName: \"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.304785    4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.406485    4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67zfj\" (UniqueName: \"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.406559    4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.406640    4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.407081    4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.408153    4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.440720    4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67zfj\" (UniqueName: \"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.572444    4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: E0219 21:39:21.614805    4795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2d9a848d2fad25e3057d21c616fc09057534181c5aaede0764f37d7275d818b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 21:39:21 crc kubenswrapper[4795]: E0219 21:39:21.615136    4795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2d9a848d2fad25e3057d21c616fc09057534181c5aaede0764f37d7275d818b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: E0219 21:39:21.615200    4795 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2d9a848d2fad25e3057d21c616fc09057534181c5aaede0764f37d7275d818b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:21 crc kubenswrapper[4795]: E0219 21:39:21.615272    4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2d9a848d2fad25e3057d21c616fc09057534181c5aaede0764f37d7275d818b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-6tsln" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62"
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.870026    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"b21df4c128974f22a00aac324263ae0ce26cad96082a2cfc1a7978c4b8725776"}
Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.910735    4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" podStartSLOduration=5.9107118629999995 podStartE2EDuration="5.910711863s" podCreationTimestamp="2026-02-19 21:39:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:39:21.906381725 +0000 UTC m=+673.098899589" watchObservedRunningTime="2026-02-19 21:39:21.910711863 +0000 UTC m=+673.103229767"
Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.536723    4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6tsln"]
Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.536855    4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.537259    4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:22 crc kubenswrapper[4795]: E0219 21:39:22.564207    4795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(165272ba5a3d1fdfec0a57a6458884ba2762924d409002cf611a6478b4e752a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 21:39:22 crc kubenswrapper[4795]: E0219 21:39:22.564528    4795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(165272ba5a3d1fdfec0a57a6458884ba2762924d409002cf611a6478b4e752a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:22 crc kubenswrapper[4795]: E0219 21:39:22.564549    4795 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(165272ba5a3d1fdfec0a57a6458884ba2762924d409002cf611a6478b4e752a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:22 crc kubenswrapper[4795]: E0219 21:39:22.564594    4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(165272ba5a3d1fdfec0a57a6458884ba2762924d409002cf611a6478b4e752a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-6tsln" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62"
Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.874637    4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb"
Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.874688    4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb"
Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.874702    4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb"
Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.898000    4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb"
Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.899794    4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb"
Feb 19 21:39:29 crc kubenswrapper[4795]: I0219 21:39:29.514479    4795 scope.go:117] "RemoveContainer" containerID="2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9"
Feb 19 21:39:29 crc kubenswrapper[4795]: E0219 21:39:29.515039    4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5p6d9_openshift-multus(e967392b-9bd8-4111-b1b9-96d503a19668)\"" pod="openshift-multus/multus-5p6d9" podUID="e967392b-9bd8-4111-b1b9-96d503a19668"
Feb 19 21:39:35 crc kubenswrapper[4795]: I0219 21:39:35.511424    4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:35 crc kubenswrapper[4795]: I0219 21:39:35.512464    4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:35 crc kubenswrapper[4795]: E0219 21:39:35.564839    4795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2cac28f05bb02e95f07312d123438159c25eddb8af19222fb8034a1595da2469): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 21:39:35 crc kubenswrapper[4795]: E0219 21:39:35.564948    4795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2cac28f05bb02e95f07312d123438159c25eddb8af19222fb8034a1595da2469): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:35 crc kubenswrapper[4795]: E0219 21:39:35.564997    4795 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2cac28f05bb02e95f07312d123438159c25eddb8af19222fb8034a1595da2469): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:35 crc kubenswrapper[4795]: E0219 21:39:35.565091    4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2cac28f05bb02e95f07312d123438159c25eddb8af19222fb8034a1595da2469): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-6tsln" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62"
Feb 19 21:39:44 crc kubenswrapper[4795]: I0219 21:39:44.511671    4795 scope.go:117] "RemoveContainer" containerID="2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9"
Feb 19 21:39:45 crc kubenswrapper[4795]: I0219 21:39:45.010779    4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/2.log"
Feb 19 21:39:45 crc kubenswrapper[4795]: I0219 21:39:45.011151    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"4a1caddc7b9e55db86fe435872d1d75dbec41873652387af8ca72cbba985cceb"}
Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.510970    4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.511825    4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.580050    4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb"
Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.806520    4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6tsln"]
Feb 19 21:39:46 crc kubenswrapper[4795]: W0219 21:39:46.818821    4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc847694_39ea_4c3c_bb58_0f920e59ac62.slice/crio-dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731 WatchSource:0}: Error finding container dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731: Status 404 returned error can't find the container with id dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731
Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.821617    4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 21:39:47 crc kubenswrapper[4795]: I0219 21:39:47.021996    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6tsln" event={"ID":"dc847694-39ea-4c3c-bb58-0f920e59ac62","Type":"ContainerStarted","Data":"dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731"}
Feb 19 21:39:49 crc kubenswrapper[4795]: I0219 21:39:49.034224    4795 generic.go:334] "Generic (PLEG): container finished" podID="dc847694-39ea-4c3c-bb58-0f920e59ac62" containerID="1e723054d3a11aa58d4ff2996b03eae32a1cee8cb004d6442cd324f9c14218e2" exitCode=0
Feb 19 21:39:49 crc kubenswrapper[4795]: I0219 21:39:49.034319    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6tsln" event={"ID":"dc847694-39ea-4c3c-bb58-0f920e59ac62","Type":"ContainerDied","Data":"1e723054d3a11aa58d4ff2996b03eae32a1cee8cb004d6442cd324f9c14218e2"}
Feb 19 21:39:50 crc kubenswrapper[4795]: I0219 21:39:50.970425    4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.028698    4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") pod \"dc847694-39ea-4c3c-bb58-0f920e59ac62\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") "
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.028770    4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67zfj\" (UniqueName: \"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") pod \"dc847694-39ea-4c3c-bb58-0f920e59ac62\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") "
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.028898    4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") pod \"dc847694-39ea-4c3c-bb58-0f920e59ac62\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") "
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.028924    4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "dc847694-39ea-4c3c-bb58-0f920e59ac62" (UID: "dc847694-39ea-4c3c-bb58-0f920e59ac62"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.029282    4795 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.033899    4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj" (OuterVolumeSpecName: "kube-api-access-67zfj") pod "dc847694-39ea-4c3c-bb58-0f920e59ac62" (UID: "dc847694-39ea-4c3c-bb58-0f920e59ac62"). InnerVolumeSpecName "kube-api-access-67zfj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.042256    4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "dc847694-39ea-4c3c-bb58-0f920e59ac62" (UID: "dc847694-39ea-4c3c-bb58-0f920e59ac62"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.046616    4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6tsln" event={"ID":"dc847694-39ea-4c3c-bb58-0f920e59ac62","Type":"ContainerDied","Data":"dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731"}
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.046650    4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731"
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.046699    4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln"
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.129683    4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67zfj\" (UniqueName: \"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") on node \"crc\" DevicePath \"\""
Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.129712    4795 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") on node \"crc\" DevicePath \"\""
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.427785    4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.428358    4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.861729    4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"]
Feb 19 21:39:58 crc kubenswrapper[4795]: E0219 21:39:58.861954    4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" containerName="storage"
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.861972    4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" containerName="storage"
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.862090    4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" containerName="storage"
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.862754    4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.864831    4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.869826    4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"]
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.923481    4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.923528    4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"
Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.923551    4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"
Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024073    4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"
Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024129    4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"
Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024160    4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"
Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024671    4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID:
\"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024696 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.041496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.178818 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.335270 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"] Feb 19 21:40:00 crc kubenswrapper[4795]: I0219 21:40:00.095722 4795 generic.go:334] "Generic (PLEG): container finished" podID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerID="8f8f73f759d7873ea56e90568186bc9854f950930e7d81d6f7da9012a8ba5393" exitCode=0 Feb 19 21:40:00 crc kubenswrapper[4795]: I0219 21:40:00.095770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerDied","Data":"8f8f73f759d7873ea56e90568186bc9854f950930e7d81d6f7da9012a8ba5393"} Feb 19 21:40:00 crc kubenswrapper[4795]: I0219 21:40:00.095828 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerStarted","Data":"164513c4517cddc9aa0b1b5c63c226485f295baefd630938b695cfd17f59c08a"} Feb 19 21:40:02 crc kubenswrapper[4795]: I0219 21:40:02.105950 4795 generic.go:334] "Generic (PLEG): container finished" podID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerID="cd7a2b9ac2ff5a35837c99c829afcd08ccf0c370a64c862942b5af3fe6d51acd" exitCode=0 Feb 19 21:40:02 crc kubenswrapper[4795]: I0219 21:40:02.106047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerDied","Data":"cd7a2b9ac2ff5a35837c99c829afcd08ccf0c370a64c862942b5af3fe6d51acd"} Feb 19 21:40:03 crc kubenswrapper[4795]: I0219 21:40:03.118730 4795 
generic.go:334] "Generic (PLEG): container finished" podID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerID="593a16368bd401d1c61c0f86210f64969596b90ed4236c8a754ef8fe915a191a" exitCode=0 Feb 19 21:40:03 crc kubenswrapper[4795]: I0219 21:40:03.118813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerDied","Data":"593a16368bd401d1c61c0f86210f64969596b90ed4236c8a754ef8fe915a191a"} Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.318780 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.388738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") pod \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.388855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") pod \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.388923 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") pod \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.389676 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle" (OuterVolumeSpecName: "bundle") pod "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" (UID: "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.394589 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs" (OuterVolumeSpecName: "kube-api-access-v9mhs") pod "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" (UID: "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a"). InnerVolumeSpecName "kube-api-access-v9mhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.408185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util" (OuterVolumeSpecName: "util") pod "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" (UID: "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.489874 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.489904 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.489917 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:05 crc kubenswrapper[4795]: I0219 21:40:05.131280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerDied","Data":"164513c4517cddc9aa0b1b5c63c226485f295baefd630938b695cfd17f59c08a"} Feb 19 21:40:05 crc kubenswrapper[4795]: I0219 21:40:05.131328 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:40:05 crc kubenswrapper[4795]: I0219 21:40:05.131322 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="164513c4517cddc9aa0b1b5c63c226485f295baefd630938b695cfd17f59c08a" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.586708 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rzkrx"] Feb 19 21:40:07 crc kubenswrapper[4795]: E0219 21:40:07.587178 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="extract" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587190 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="extract" Feb 19 21:40:07 crc kubenswrapper[4795]: E0219 21:40:07.587202 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="util" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587208 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="util" Feb 19 21:40:07 crc kubenswrapper[4795]: E0219 21:40:07.587217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="pull" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="pull" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587318 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="extract" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587709 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.591493 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.591787 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.592017 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-47nj5" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.601872 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rzkrx"] Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.632755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqgzl\" (UniqueName: \"kubernetes.io/projected/6b614198-6804-46a3-bb1e-d8495c0d53d6-kube-api-access-mqgzl\") pod \"nmstate-operator-694c9596b7-rzkrx\" (UID: \"6b614198-6804-46a3-bb1e-d8495c0d53d6\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.733927 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqgzl\" (UniqueName: \"kubernetes.io/projected/6b614198-6804-46a3-bb1e-d8495c0d53d6-kube-api-access-mqgzl\") pod \"nmstate-operator-694c9596b7-rzkrx\" (UID: \"6b614198-6804-46a3-bb1e-d8495c0d53d6\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.751126 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqgzl\" (UniqueName: \"kubernetes.io/projected/6b614198-6804-46a3-bb1e-d8495c0d53d6-kube-api-access-mqgzl\") pod \"nmstate-operator-694c9596b7-rzkrx\" (UID: 
\"6b614198-6804-46a3-bb1e-d8495c0d53d6\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.901836 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:08 crc kubenswrapper[4795]: I0219 21:40:08.302316 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rzkrx"] Feb 19 21:40:08 crc kubenswrapper[4795]: W0219 21:40:08.307155 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b614198_6804_46a3_bb1e_d8495c0d53d6.slice/crio-54395a4aa981739ddd50eb4d29527f4dc4b645d1e35236d8afd23037dbc3e977 WatchSource:0}: Error finding container 54395a4aa981739ddd50eb4d29527f4dc4b645d1e35236d8afd23037dbc3e977: Status 404 returned error can't find the container with id 54395a4aa981739ddd50eb4d29527f4dc4b645d1e35236d8afd23037dbc3e977 Feb 19 21:40:09 crc kubenswrapper[4795]: I0219 21:40:09.150023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" event={"ID":"6b614198-6804-46a3-bb1e-d8495c0d53d6","Type":"ContainerStarted","Data":"54395a4aa981739ddd50eb4d29527f4dc4b645d1e35236d8afd23037dbc3e977"} Feb 19 21:40:11 crc kubenswrapper[4795]: I0219 21:40:11.162365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" event={"ID":"6b614198-6804-46a3-bb1e-d8495c0d53d6","Type":"ContainerStarted","Data":"f7595c627af0267910b3cc1c0692dd57c951900bf3d553d3838e63683c58ab68"} Feb 19 21:40:11 crc kubenswrapper[4795]: I0219 21:40:11.179635 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" podStartSLOduration=2.330833938 podStartE2EDuration="4.179621043s" podCreationTimestamp="2026-02-19 21:40:07 +0000 UTC" 
firstStartedPulling="2026-02-19 21:40:08.309504595 +0000 UTC m=+719.502022459" lastFinishedPulling="2026-02-19 21:40:10.1582917 +0000 UTC m=+721.350809564" observedRunningTime="2026-02-19 21:40:11.17807398 +0000 UTC m=+722.370591854" watchObservedRunningTime="2026-02-19 21:40:11.179621043 +0000 UTC m=+722.372138907" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.091949 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.093063 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.103525 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-dpglh" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.109946 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.144968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.145805 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.147305 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.149508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.155047 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zqk47"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.155694 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184132 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-ovs-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184171 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tfl8\" (UniqueName: \"kubernetes.io/projected/1dfc7b5c-9302-4774-a6c8-e76ff4d60385-kube-api-access-4tfl8\") pod \"nmstate-metrics-58c85c668d-kgnfd\" (UID: \"1dfc7b5c-9302-4774-a6c8-e76ff4d60385\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-nmstate-lock\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " 
pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6c89273b-007f-44e6-88da-f48de3a5f03b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxs5c\" (UniqueName: \"kubernetes.io/projected/6c89273b-007f-44e6-88da-f48de3a5f03b-kube-api-access-lxs5c\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-dbus-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184487 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsgtd\" (UniqueName: \"kubernetes.io/projected/06d09723-c7bd-422c-b447-70dee244cc05-kube-api-access-hsgtd\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.253920 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.254698 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.257178 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.262389 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.264159 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6rsrp" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285177 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxs5c\" (UniqueName: \"kubernetes.io/projected/6c89273b-007f-44e6-88da-f48de3a5f03b-kube-api-access-lxs5c\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-dbus-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285264 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsgtd\" (UniqueName: \"kubernetes.io/projected/06d09723-c7bd-422c-b447-70dee244cc05-kube-api-access-hsgtd\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285301 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwgjv\" (UniqueName: \"kubernetes.io/projected/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-kube-api-access-vwgjv\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285345 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285369 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-ovs-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tfl8\" (UniqueName: \"kubernetes.io/projected/1dfc7b5c-9302-4774-a6c8-e76ff4d60385-kube-api-access-4tfl8\") pod \"nmstate-metrics-58c85c668d-kgnfd\" (UID: \"1dfc7b5c-9302-4774-a6c8-e76ff4d60385\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285431 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-nmstate-lock\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285449 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6c89273b-007f-44e6-88da-f48de3a5f03b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-ovs-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-dbus-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-nmstate-lock\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.293601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/6c89273b-007f-44e6-88da-f48de3a5f03b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.300911 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.304721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsgtd\" (UniqueName: \"kubernetes.io/projected/06d09723-c7bd-422c-b447-70dee244cc05-kube-api-access-hsgtd\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.305413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxs5c\" (UniqueName: \"kubernetes.io/projected/6c89273b-007f-44e6-88da-f48de3a5f03b-kube-api-access-lxs5c\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.315997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tfl8\" (UniqueName: \"kubernetes.io/projected/1dfc7b5c-9302-4774-a6c8-e76ff4d60385-kube-api-access-4tfl8\") pod \"nmstate-metrics-58c85c668d-kgnfd\" (UID: \"1dfc7b5c-9302-4774-a6c8-e76ff4d60385\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.386147 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwgjv\" (UniqueName: \"kubernetes.io/projected/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-kube-api-access-vwgjv\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.386205 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.386272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.387035 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: E0219 21:40:12.387109 4795 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 19 21:40:12 crc kubenswrapper[4795]: E0219 21:40:12.387153 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert podName:f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8 nodeName:}" failed. No retries permitted until 2026-02-19 21:40:12.887138833 +0000 UTC m=+724.079656697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-gp2td" (UID: "f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8") : secret "plugin-serving-cert" not found Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.402343 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwgjv\" (UniqueName: \"kubernetes.io/projected/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-kube-api-access-vwgjv\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.446638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.453730 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6699bdbc6b-7fbd7"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.454341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.483382 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.484128 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6699bdbc6b-7fbd7"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.496514 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: W0219 21:40:12.558115 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d09723_c7bd_422c_b447_70dee244cc05.slice/crio-72c964e7fb5502abf39fea516705b245fe7b7c2087a640f5fdde882b2b1bb29b WatchSource:0}: Error finding container 72c964e7fb5502abf39fea516705b245fe7b7c2087a640f5fdde882b2b1bb29b: Status 404 returned error can't find the container with id 72c964e7fb5502abf39fea516705b245fe7b7c2087a640f5fdde882b2b1bb29b Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-service-ca\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-trusted-ca-bundle\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589762 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-console-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5l2q\" (UniqueName: \"kubernetes.io/projected/55f88401-be34-4486-8e8c-4c3be51ab251-kube-api-access-n5l2q\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589807 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-oauth-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589822 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-oauth-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.665235 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.690900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-service-ca\") pod \"console-6699bdbc6b-7fbd7\" (UID: 
\"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691305 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-trusted-ca-bundle\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-console-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5l2q\" (UniqueName: \"kubernetes.io/projected/55f88401-be34-4486-8e8c-4c3be51ab251-kube-api-access-n5l2q\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-oauth-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: 
\"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-oauth-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-service-ca\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.692109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-oauth-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.692620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-console-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.693115 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-trusted-ca-bundle\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " 
pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.696423 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-oauth-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.700743 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.705939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5l2q\" (UniqueName: \"kubernetes.io/projected/55f88401-be34-4486-8e8c-4c3be51ab251-kube-api-access-n5l2q\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.802942 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.893331 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.897929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.927929 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb"] Feb 19 21:40:12 crc kubenswrapper[4795]: W0219 21:40:12.944979 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c89273b_007f_44e6_88da_f48de3a5f03b.slice/crio-5613366bc7262aa9b87e14088524bc21ac6efc15f0a0bba4fefbf1e99942e393 WatchSource:0}: Error finding container 5613366bc7262aa9b87e14088524bc21ac6efc15f0a0bba4fefbf1e99942e393: Status 404 returned error can't find the container with id 5613366bc7262aa9b87e14088524bc21ac6efc15f0a0bba4fefbf1e99942e393 Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.984992 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6699bdbc6b-7fbd7"] Feb 19 21:40:12 crc kubenswrapper[4795]: W0219 21:40:12.990416 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55f88401_be34_4486_8e8c_4c3be51ab251.slice/crio-060a5e633139d5351f0cbb0878548a867828a9586a7491786a3d1147e8abd5b4 WatchSource:0}: Error finding container 060a5e633139d5351f0cbb0878548a867828a9586a7491786a3d1147e8abd5b4: Status 404 returned error can't find the container with id 060a5e633139d5351f0cbb0878548a867828a9586a7491786a3d1147e8abd5b4 Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.166660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.172705 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" event={"ID":"6c89273b-007f-44e6-88da-f48de3a5f03b","Type":"ContainerStarted","Data":"5613366bc7262aa9b87e14088524bc21ac6efc15f0a0bba4fefbf1e99942e393"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.174468 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zqk47" event={"ID":"06d09723-c7bd-422c-b447-70dee244cc05","Type":"ContainerStarted","Data":"72c964e7fb5502abf39fea516705b245fe7b7c2087a640f5fdde882b2b1bb29b"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.177213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6699bdbc6b-7fbd7" event={"ID":"55f88401-be34-4486-8e8c-4c3be51ab251","Type":"ContainerStarted","Data":"0782a73adbea456f2d44b5878f555fc1836a9ec6d638da94a76199740bf9627d"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.177252 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6699bdbc6b-7fbd7" event={"ID":"55f88401-be34-4486-8e8c-4c3be51ab251","Type":"ContainerStarted","Data":"060a5e633139d5351f0cbb0878548a867828a9586a7491786a3d1147e8abd5b4"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.180131 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" event={"ID":"1dfc7b5c-9302-4774-a6c8-e76ff4d60385","Type":"ContainerStarted","Data":"d669bb293befeeb26b605590e515a05af3aaf925a4aa01dfd16bce44738deef4"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.193239 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6699bdbc6b-7fbd7" podStartSLOduration=1.193219206 podStartE2EDuration="1.193219206s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:40:13.192558537 +0000 UTC m=+724.385076401" watchObservedRunningTime="2026-02-19 21:40:13.193219206 +0000 UTC m=+724.385737060" Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.364644 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td"] Feb 19 21:40:14 crc kubenswrapper[4795]: I0219 21:40:14.195035 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" event={"ID":"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8","Type":"ContainerStarted","Data":"fe32af036e454241c567cc8e0f61e0866559d9a7d3a4502e65dcb281e729761d"} Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.206909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" event={"ID":"1dfc7b5c-9302-4774-a6c8-e76ff4d60385","Type":"ContainerStarted","Data":"61115fc7fcf512e5a26159f193db4580d5a6a7507403b0bc4169c12cc951ca11"} Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.208971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" event={"ID":"6c89273b-007f-44e6-88da-f48de3a5f03b","Type":"ContainerStarted","Data":"42355c608a117e5b0c1f3f0b421dd26ea3d75e379af3aeb7703f92cd5be3c557"} Feb 19 21:40:15 crc 
kubenswrapper[4795]: I0219 21:40:15.209123 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.210698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zqk47" event={"ID":"06d09723-c7bd-422c-b447-70dee244cc05","Type":"ContainerStarted","Data":"d3f3ab5956da1b5c9ba5705b7eb80eda83b416a4d638ed678f93df9fc6895974"} Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.210846 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.225943 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" podStartSLOduration=1.2471275259999999 podStartE2EDuration="3.225925735s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="2026-02-19 21:40:12.948430229 +0000 UTC m=+724.140948093" lastFinishedPulling="2026-02-19 21:40:14.927228428 +0000 UTC m=+726.119746302" observedRunningTime="2026-02-19 21:40:15.22465545 +0000 UTC m=+726.417173314" watchObservedRunningTime="2026-02-19 21:40:15.225925735 +0000 UTC m=+726.418443599" Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.250253 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zqk47" podStartSLOduration=0.92566856 podStartE2EDuration="3.250230097s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="2026-02-19 21:40:12.559764275 +0000 UTC m=+723.752282139" lastFinishedPulling="2026-02-19 21:40:14.884325812 +0000 UTC m=+726.076843676" observedRunningTime="2026-02-19 21:40:15.243333596 +0000 UTC m=+726.435851470" watchObservedRunningTime="2026-02-19 21:40:15.250230097 +0000 UTC m=+726.442747961" Feb 19 21:40:16 crc kubenswrapper[4795]: I0219 21:40:16.221663 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" event={"ID":"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8","Type":"ContainerStarted","Data":"b11adb14302c0d4d392ad94899855264089be2fef7f39e171d5e437b5cbf641d"} Feb 19 21:40:16 crc kubenswrapper[4795]: I0219 21:40:16.244407 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" podStartSLOduration=1.845439686 podStartE2EDuration="4.244382259s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="2026-02-19 21:40:13.370729533 +0000 UTC m=+724.563247397" lastFinishedPulling="2026-02-19 21:40:15.769672096 +0000 UTC m=+726.962189970" observedRunningTime="2026-02-19 21:40:16.235543724 +0000 UTC m=+727.428061608" watchObservedRunningTime="2026-02-19 21:40:16.244382259 +0000 UTC m=+727.436900123" Feb 19 21:40:17 crc kubenswrapper[4795]: I0219 21:40:17.226899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" event={"ID":"1dfc7b5c-9302-4774-a6c8-e76ff4d60385","Type":"ContainerStarted","Data":"21cf6b304361dbb86a91906186b1f42903ce3f0af0e47747fd83a53d2746c9b0"} Feb 19 21:40:17 crc kubenswrapper[4795]: I0219 21:40:17.244548 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" podStartSLOduration=1.193796351 podStartE2EDuration="5.244533455s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="2026-02-19 21:40:12.675914956 +0000 UTC m=+723.868432820" lastFinishedPulling="2026-02-19 21:40:16.72665207 +0000 UTC m=+727.919169924" observedRunningTime="2026-02-19 21:40:17.240612437 +0000 UTC m=+728.433130321" watchObservedRunningTime="2026-02-19 21:40:17.244533455 +0000 UTC m=+728.437051319" Feb 19 21:40:22 crc kubenswrapper[4795]: I0219 21:40:22.529637 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:22 crc kubenswrapper[4795]: I0219 21:40:22.803649 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:22 crc kubenswrapper[4795]: I0219 21:40:22.804240 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:22 crc kubenswrapper[4795]: I0219 21:40:22.811255 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:23 crc kubenswrapper[4795]: I0219 21:40:23.273588 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:23 crc kubenswrapper[4795]: I0219 21:40:23.336893 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"] Feb 19 21:40:28 crc kubenswrapper[4795]: I0219 21:40:28.427277 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:40:28 crc kubenswrapper[4795]: I0219 21:40:28.427651 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:40:32 crc kubenswrapper[4795]: I0219 21:40:32.490308 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:42 crc kubenswrapper[4795]: I0219 21:40:42.249212 4795 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.342143 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj"] Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.343862 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.346927 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.358015 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj"] Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.445807 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.445888 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.446033 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.547475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.547530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.547576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.547944 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.548142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.566475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.661214 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj"
Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.831653 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj"]
Feb 19 21:40:47 crc kubenswrapper[4795]: I0219 21:40:47.405901 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerID="8e8e52ed11c438f4bee44742883e14e18265ed4eb79cf0379b3f52cfd53cb52b" exitCode=0
Feb 19 21:40:47 crc kubenswrapper[4795]: I0219 21:40:47.405993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerDied","Data":"8e8e52ed11c438f4bee44742883e14e18265ed4eb79cf0379b3f52cfd53cb52b"}
Feb 19 21:40:47 crc kubenswrapper[4795]: I0219 21:40:47.406374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerStarted","Data":"d5f862a402ff333110036da6599ba414ec2036ba5942154c2f4bc5a556c0dcb8"}
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.398238 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rvkhj" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console" containerID="cri-o://4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463" gracePeriod=15
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.704444 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"]
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.706065 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.711318 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"]
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.773686 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rvkhj_ec60d287-0f21-467c-8030-84b8726af567/console/0.log"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.773768 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rvkhj"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.800794 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.800856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.800888 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsv5f\" (UniqueName: \"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") "
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902352 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") "
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902451 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") "
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902483 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") "
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") "
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902542 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") "
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902582 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") "
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config" (OuterVolumeSpecName: "console-config") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902881 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsv5f\" (UniqueName: \"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903180 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903293 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903318 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903331 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca" (OuterVolumeSpecName: "service-ca") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.908045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.908129 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td" (OuterVolumeSpecName: "kube-api-access-8p5td") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "kube-api-access-8p5td". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.910250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.918453 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsv5f\" (UniqueName: \"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004523 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004561 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004575 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004585 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004594 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004601 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.070774 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.256878 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"]
Feb 19 21:40:49 crc kubenswrapper[4795]: W0219 21:40:49.264353 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4118207c_5a68_4979_b9b6_eb22b17052b5.slice/crio-c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684 WatchSource:0}: Error finding container c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684: Status 404 returned error can't find the container with id c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.417509 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerID="282254e9f9d949e94e17360f3140e13e0f05261314c0d710cebe2dcd8af745c8" exitCode=0
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.417678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerDied","Data":"282254e9f9d949e94e17360f3140e13e0f05261314c0d710cebe2dcd8af745c8"}
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419229 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rvkhj_ec60d287-0f21-467c-8030-84b8726af567/console/0.log"
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419267 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec60d287-0f21-467c-8030-84b8726af567" containerID="4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463" exitCode=2
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419334 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rvkhj"
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvkhj" event={"ID":"ec60d287-0f21-467c-8030-84b8726af567","Type":"ContainerDied","Data":"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463"}
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419398 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvkhj" event={"ID":"ec60d287-0f21-467c-8030-84b8726af567","Type":"ContainerDied","Data":"01605015262ec0d283a1299b16fa7df4e9785d87441123f54856fb5a6f2abf61"}
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419421 4795 scope.go:117] "RemoveContainer" containerID="4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463"
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.422216 4795 generic.go:334] "Generic (PLEG): container finished" podID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerID="c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd" exitCode=0
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.422258 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerDied","Data":"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd"}
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.422272 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerStarted","Data":"c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684"}
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.469701 4795 scope.go:117] "RemoveContainer" containerID="4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463"
Feb 19 21:40:49 crc kubenswrapper[4795]: E0219 21:40:49.470054 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463\": container with ID starting with 4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463 not found: ID does not exist" containerID="4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463"
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.470083 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463"} err="failed to get container status \"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463\": rpc error: code = NotFound desc = could not find container \"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463\": container with ID starting with 4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463 not found: ID does not exist"
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.522976 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"]
Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.525630 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"]
Feb 19 21:40:50 crc kubenswrapper[4795]: I0219 21:40:50.430134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerStarted","Data":"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6"}
Feb 19 21:40:50 crc kubenswrapper[4795]: I0219 21:40:50.437998 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerID="fc35abed06fc900449f1e23dd9b72c2de42312527d8bedd50f603c4383888de5" exitCode=0
Feb 19 21:40:50 crc kubenswrapper[4795]: I0219 21:40:50.438037 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerDied","Data":"fc35abed06fc900449f1e23dd9b72c2de42312527d8bedd50f603c4383888de5"}
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.449585 4795 generic.go:334] "Generic (PLEG): container finished" podID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerID="0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6" exitCode=0
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.449678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerDied","Data":"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6"}
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.524976 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec60d287-0f21-467c-8030-84b8726af567" path="/var/lib/kubelet/pods/ec60d287-0f21-467c-8030-84b8726af567/volumes"
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.693917 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj"
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.841144 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") pod \"8f281e72-3a5e-4abb-bbcb-7555808866be\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") "
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.841194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") pod \"8f281e72-3a5e-4abb-bbcb-7555808866be\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") "
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.841264 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") pod \"8f281e72-3a5e-4abb-bbcb-7555808866be\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") "
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.843316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle" (OuterVolumeSpecName: "bundle") pod "8f281e72-3a5e-4abb-bbcb-7555808866be" (UID: "8f281e72-3a5e-4abb-bbcb-7555808866be"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.847596 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2" (OuterVolumeSpecName: "kube-api-access-s7zj2") pod "8f281e72-3a5e-4abb-bbcb-7555808866be" (UID: "8f281e72-3a5e-4abb-bbcb-7555808866be"). InnerVolumeSpecName "kube-api-access-s7zj2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.864047 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util" (OuterVolumeSpecName: "util") pod "8f281e72-3a5e-4abb-bbcb-7555808866be" (UID: "8f281e72-3a5e-4abb-bbcb-7555808866be"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.942325 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.942819 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.942876 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") on node \"crc\" DevicePath \"\""
Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.458793 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerStarted","Data":"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9"}
Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.460778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerDied","Data":"d5f862a402ff333110036da6599ba414ec2036ba5942154c2f4bc5a556c0dcb8"}
Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.460830 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5f862a402ff333110036da6599ba414ec2036ba5942154c2f4bc5a556c0dcb8"
Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.460933 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj"
Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.526186 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zmvnt" podStartSLOduration=2.130334313 podStartE2EDuration="4.526150821s" podCreationTimestamp="2026-02-19 21:40:48 +0000 UTC" firstStartedPulling="2026-02-19 21:40:49.429647264 +0000 UTC m=+760.622165128" lastFinishedPulling="2026-02-19 21:40:51.825463762 +0000 UTC m=+763.017981636" observedRunningTime="2026-02-19 21:40:52.519311172 +0000 UTC m=+763.711829086" watchObservedRunningTime="2026-02-19 21:40:52.526150821 +0000 UTC m=+763.718668685"
Feb 19 21:40:58 crc kubenswrapper[4795]: I0219 21:40:58.427190 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:40:58 crc kubenswrapper[4795]: I0219 21:40:58.427619 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:40:58 crc kubenswrapper[4795]: I0219 21:40:58.427661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 21:40:58 crc kubenswrapper[4795]: I0219 21:40:58.428210 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 21:40:58 crc kubenswrapper[4795]: I0219 21:40:58.428272 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca" gracePeriod=600
Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.071404 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.071833 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.111845 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.498651 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca" exitCode=0
Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.498710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca"}
Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.498760 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f"}
Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.498782 4795 scope.go:117] "RemoveContainer" containerID="7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a"
Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.543095 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.499292 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"]
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.510229 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zmvnt" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="registry-server" containerID="cri-o://cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" gracePeriod=2
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.613340 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"]
Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.613876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="pull"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.613888 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="pull"
Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.613904 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="extract"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.613910 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="extract"
Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.613923 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="util"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.613929 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="util"
Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.613939 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.613945 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.614038 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="extract"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.614053 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.614414 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.616208 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.616211 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.616841 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.632264 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.632583 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-d4wz6"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.636367 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"]
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.768976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-webhook-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.769020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-apiservice-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.769051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcbx4\" (UniqueName: \"kubernetes.io/projected/2eb889b2-1f23-4497-a779-5312fcd470b1-kube-api-access-zcbx4\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.869788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcbx4\" (UniqueName: \"kubernetes.io/projected/2eb889b2-1f23-4497-a779-5312fcd470b1-kube-api-access-zcbx4\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.869906 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-webhook-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.869923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-apiservice-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.875393 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-webhook-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.888933 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcbx4\" (UniqueName: \"kubernetes.io/projected/2eb889b2-1f23-4497-a779-5312fcd470b1-kube-api-access-zcbx4\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.893920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-apiservice-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.965476 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvnt"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.966021 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.980423 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x"]
Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.980915 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="extract-content"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.980930 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="extract-content"
Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.980944 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="extract-utilities"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.980951 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="extract-utilities"
Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.980959 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="registry-server"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.980965 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="registry-server"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.981052 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="registry-server"
Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.981391 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.983140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.983294 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.983698 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-q6xth" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.993652 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x"] Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.128774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") pod \"4118207c-5a68-4979-b9b6-eb22b17052b5\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") pod \"4118207c-5a68-4979-b9b6-eb22b17052b5\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129450 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsv5f\" (UniqueName: \"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") pod \"4118207c-5a68-4979-b9b6-eb22b17052b5\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129628 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-webhook-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129705 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-apiservice-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g457x\" (UniqueName: \"kubernetes.io/projected/94a7e477-a2bd-4c46-8eb0-084260fade4a-kube-api-access-g457x\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities" (OuterVolumeSpecName: "utilities") pod "4118207c-5a68-4979-b9b6-eb22b17052b5" (UID: "4118207c-5a68-4979-b9b6-eb22b17052b5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.136425 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f" (OuterVolumeSpecName: "kube-api-access-rsv5f") pod "4118207c-5a68-4979-b9b6-eb22b17052b5" (UID: "4118207c-5a68-4979-b9b6-eb22b17052b5"). InnerVolumeSpecName "kube-api-access-rsv5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g457x\" (UniqueName: \"kubernetes.io/projected/94a7e477-a2bd-4c46-8eb0-084260fade4a-kube-api-access-g457x\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-webhook-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230359 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-apiservice-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230414 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsv5f\" (UniqueName: 
\"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230424 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.234967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-apiservice-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.234982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-webhook-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.250726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g457x\" (UniqueName: \"kubernetes.io/projected/94a7e477-a2bd-4c46-8eb0-084260fade4a-kube-api-access-g457x\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.292986 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4118207c-5a68-4979-b9b6-eb22b17052b5" (UID: 
"4118207c-5a68-4979-b9b6-eb22b17052b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.319646 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.332106 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.481390 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"] Feb 19 21:41:02 crc kubenswrapper[4795]: W0219 21:41:02.488010 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eb889b2_1f23_4497_a779_5312fcd470b1.slice/crio-8fc354a2031a50acbd85479a8cf74b120b4e788591fcc72267bcca2224a37c12 WatchSource:0}: Error finding container 8fc354a2031a50acbd85479a8cf74b120b4e788591fcc72267bcca2224a37c12: Status 404 returned error can't find the container with id 8fc354a2031a50acbd85479a8cf74b120b4e788591fcc72267bcca2224a37c12 Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.523725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" event={"ID":"2eb889b2-1f23-4497-a779-5312fcd470b1","Type":"ContainerStarted","Data":"8fc354a2031a50acbd85479a8cf74b120b4e788591fcc72267bcca2224a37c12"} Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.533354 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x"] Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535041 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerID="cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" exitCode=0 Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerDied","Data":"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9"} Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535094 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerDied","Data":"c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684"} Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535116 4795 scope.go:117] "RemoveContainer" containerID="cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535253 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.555988 4795 scope.go:117] "RemoveContainer" containerID="0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.562876 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"] Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.576874 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"] Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.588951 4795 scope.go:117] "RemoveContainer" containerID="c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.609374 4795 scope.go:117] "RemoveContainer" containerID="cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" Feb 19 21:41:02 crc kubenswrapper[4795]: E0219 21:41:02.611366 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9\": container with ID starting with cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9 not found: ID does not exist" containerID="cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.611406 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9"} err="failed to get container status \"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9\": rpc error: code = NotFound desc = could not find container \"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9\": container with ID starting with cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9 not found: ID does 
not exist" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.611435 4795 scope.go:117] "RemoveContainer" containerID="0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6" Feb 19 21:41:02 crc kubenswrapper[4795]: E0219 21:41:02.621287 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6\": container with ID starting with 0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6 not found: ID does not exist" containerID="0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.621333 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6"} err="failed to get container status \"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6\": rpc error: code = NotFound desc = could not find container \"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6\": container with ID starting with 0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6 not found: ID does not exist" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.621359 4795 scope.go:117] "RemoveContainer" containerID="c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd" Feb 19 21:41:02 crc kubenswrapper[4795]: E0219 21:41:02.625260 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd\": container with ID starting with c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd not found: ID does not exist" containerID="c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.625304 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd"} err="failed to get container status \"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd\": rpc error: code = NotFound desc = could not find container \"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd\": container with ID starting with c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd not found: ID does not exist" Feb 19 21:41:03 crc kubenswrapper[4795]: I0219 21:41:03.520940 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" path="/var/lib/kubelet/pods/4118207c-5a68-4979-b9b6-eb22b17052b5/volumes" Feb 19 21:41:03 crc kubenswrapper[4795]: I0219 21:41:03.545508 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" event={"ID":"94a7e477-a2bd-4c46-8eb0-084260fade4a","Type":"ContainerStarted","Data":"7ef25b170c2459d47bd81b8c4b15a2f3dcf9f6b31f799843a81af7a10f7025a1"} Feb 19 21:41:07 crc kubenswrapper[4795]: I0219 21:41:07.565338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" event={"ID":"2eb889b2-1f23-4497-a779-5312fcd470b1","Type":"ContainerStarted","Data":"fd20c7d88200f4018d0640ab4e5f5ed4790e0d8e988c9bba85b4d3915a758139"} Feb 19 21:41:07 crc kubenswrapper[4795]: I0219 21:41:07.565864 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:07 crc kubenswrapper[4795]: I0219 21:41:07.566932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" event={"ID":"94a7e477-a2bd-4c46-8eb0-084260fade4a","Type":"ContainerStarted","Data":"70a76006e68dfb5df6fddc68cf0e339cb3af018b44bfda881f8ff3833dbac99a"} Feb 19 21:41:07 crc 
kubenswrapper[4795]: I0219 21:41:07.567152 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:07 crc kubenswrapper[4795]: I0219 21:41:07.588602 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" podStartSLOduration=2.482430395 podStartE2EDuration="6.588584061s" podCreationTimestamp="2026-02-19 21:41:01 +0000 UTC" firstStartedPulling="2026-02-19 21:41:02.491262396 +0000 UTC m=+773.683780260" lastFinishedPulling="2026-02-19 21:41:06.597416062 +0000 UTC m=+777.789933926" observedRunningTime="2026-02-19 21:41:07.585872152 +0000 UTC m=+778.778390056" watchObservedRunningTime="2026-02-19 21:41:07.588584061 +0000 UTC m=+778.781101945" Feb 19 21:41:22 crc kubenswrapper[4795]: I0219 21:41:22.323995 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:22 crc kubenswrapper[4795]: I0219 21:41:22.353253 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" podStartSLOduration=17.290933457 podStartE2EDuration="21.35322858s" podCreationTimestamp="2026-02-19 21:41:01 +0000 UTC" firstStartedPulling="2026-02-19 21:41:02.547577263 +0000 UTC m=+773.740095127" lastFinishedPulling="2026-02-19 21:41:06.609872386 +0000 UTC m=+777.802390250" observedRunningTime="2026-02-19 21:41:07.608249049 +0000 UTC m=+778.800766913" watchObservedRunningTime="2026-02-19 21:41:22.35322858 +0000 UTC m=+793.545746474" Feb 19 21:41:41 crc kubenswrapper[4795]: I0219 21:41:41.969533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.755197 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.755866 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.759382 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-n92rw" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.759536 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.769312 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-b7csh"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.772862 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.775456 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.775501 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.817381 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.840886 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-kmbww"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.841970 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-kmbww" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.844475 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.844669 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.844842 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.845234 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vfjqv" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.873914 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-xrsfh"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.874770 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.876575 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.889412 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-xrsfh"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.952646 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-reloader\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.952688 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metrics-certs\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.952981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e32c1521-9c29-4d70-b4bb-54af4127daaf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953042 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-conf\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 
21:41:42.953092 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics-certs\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953123 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953182 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-startup\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-sockets\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953232 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953254 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pj6jp\" (UniqueName: \"kubernetes.io/projected/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-kube-api-access-pj6jp\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953283 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq89b\" (UniqueName: \"kubernetes.io/projected/32ed0d55-a2df-4643-9283-e5bc8d1c993e-kube-api-access-bq89b\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953309 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzkw\" (UniqueName: \"kubernetes.io/projected/e32c1521-9c29-4d70-b4bb-54af4127daaf-kube-api-access-chzkw\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metallb-excludel2\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj6jp\" 
(UniqueName: \"kubernetes.io/projected/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-kube-api-access-pj6jp\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq89b\" (UniqueName: \"kubernetes.io/projected/32ed0d55-a2df-4643-9283-e5bc8d1c993e-kube-api-access-bq89b\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chzkw\" (UniqueName: \"kubernetes.io/projected/e32c1521-9c29-4d70-b4bb-54af4127daaf-kube-api-access-chzkw\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metallb-excludel2\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-reloader\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-metrics-certs\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metrics-certs\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054264 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-cert\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e32c1521-9c29-4d70-b4bb-54af4127daaf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-conf\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics-certs\") pod \"frr-k8s-b7csh\" (UID: 
\"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2l5g\" (UniqueName: \"kubernetes.io/projected/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-kube-api-access-h2l5g\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-startup\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054751 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-sockets\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054798 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-reloader\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054925 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metallb-excludel2\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.055094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-sockets\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.055177 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-conf\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: E0219 21:41:43.055190 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 21:41:43 crc kubenswrapper[4795]: E0219 21:41:43.055266 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist podName:32ed0d55-a2df-4643-9283-e5bc8d1c993e nodeName:}" failed. No retries permitted until 2026-02-19 21:41:43.55524524 +0000 UTC m=+814.747763104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist") pod "speaker-kmbww" (UID: "32ed0d55-a2df-4643-9283-e5bc8d1c993e") : secret "metallb-memberlist" not found Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.055413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-startup\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.063949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e32c1521-9c29-4d70-b4bb-54af4127daaf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.064404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metrics-certs\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.065616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics-certs\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.073228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj6jp\" (UniqueName: \"kubernetes.io/projected/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-kube-api-access-pj6jp\") pod \"frr-k8s-b7csh\" (UID: 
\"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.077896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzkw\" (UniqueName: \"kubernetes.io/projected/e32c1521-9c29-4d70-b4bb-54af4127daaf-kube-api-access-chzkw\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.089052 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.089727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq89b\" (UniqueName: \"kubernetes.io/projected/32ed0d55-a2df-4643-9283-e5bc8d1c993e-kube-api-access-bq89b\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.098089 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.155496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-metrics-certs\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.155548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-cert\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.155613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2l5g\" (UniqueName: \"kubernetes.io/projected/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-kube-api-access-h2l5g\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.159128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-metrics-certs\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.159493 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.172758 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-cert\") 
pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.181136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2l5g\" (UniqueName: \"kubernetes.io/projected/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-kube-api-access-h2l5g\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.190886 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.336439 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm"] Feb 19 21:41:43 crc kubenswrapper[4795]: W0219 21:41:43.341460 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode32c1521_9c29_4d70_b4bb_54af4127daaf.slice/crio-9047f712377ca3901ebb8e873e69efa04063d1504d4029f5ef3cb0edde012d3a WatchSource:0}: Error finding container 9047f712377ca3901ebb8e873e69efa04063d1504d4029f5ef3cb0edde012d3a: Status 404 returned error can't find the container with id 9047f712377ca3901ebb8e873e69efa04063d1504d4029f5ef3cb0edde012d3a Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.393593 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-xrsfh"] Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.562604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc 
kubenswrapper[4795]: I0219 21:41:43.567591 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.763718 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: W0219 21:41:43.781535 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32ed0d55_a2df_4643_9283_e5bc8d1c993e.slice/crio-671bd2731e10ef1d917a4416efcf739bbb775ba07351f4f4e615f454b1d45adb WatchSource:0}: Error finding container 671bd2731e10ef1d917a4416efcf739bbb775ba07351f4f4e615f454b1d45adb: Status 404 returned error can't find the container with id 671bd2731e10ef1d917a4416efcf739bbb775ba07351f4f4e615f454b1d45adb Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.793985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"7d3f22e5a3be1edd5e4cce1e6f3e38eda25c44cfa45aa75be523de02578b39be"} Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.795547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-xrsfh" event={"ID":"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4","Type":"ContainerStarted","Data":"4f41af9995dddda2fcfffe35d4232bad62a2c5f7794bfdbeeccc1aacf7e5ada9"} Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.795571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-xrsfh" event={"ID":"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4","Type":"ContainerStarted","Data":"e0583de77f00d9cff1a8dcebb7c4662615467eeed68c003aba755f57c3480df7"} Feb 19 21:41:43 
crc kubenswrapper[4795]: I0219 21:41:43.795581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-xrsfh" event={"ID":"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4","Type":"ContainerStarted","Data":"1d8085a3baa07142c8904cbb51aa56d861810608e02e76f8ef21d6469ddf0f48"} Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.795949 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.796747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" event={"ID":"e32c1521-9c29-4d70-b4bb-54af4127daaf","Type":"ContainerStarted","Data":"9047f712377ca3901ebb8e873e69efa04063d1504d4029f5ef3cb0edde012d3a"} Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.820093 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kmbww" event={"ID":"32ed0d55-a2df-4643-9283-e5bc8d1c993e","Type":"ContainerStarted","Data":"0670aa1ec4bddfa996238282d946eb3787421f8b28dd329074767c8497d54509"} Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.820143 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kmbww" event={"ID":"32ed0d55-a2df-4643-9283-e5bc8d1c993e","Type":"ContainerStarted","Data":"791ad1189990959af46b227040aa14ad78fed65e30d9ade491c0f1c1235f68ca"} Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.820158 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kmbww" event={"ID":"32ed0d55-a2df-4643-9283-e5bc8d1c993e","Type":"ContainerStarted","Data":"671bd2731e10ef1d917a4416efcf739bbb775ba07351f4f4e615f454b1d45adb"} Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.820416 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-kmbww" Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.836414 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-xrsfh" podStartSLOduration=2.836394956 podStartE2EDuration="2.836394956s" podCreationTimestamp="2026-02-19 21:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:41:43.820429138 +0000 UTC m=+815.012947072" watchObservedRunningTime="2026-02-19 21:41:44.836394956 +0000 UTC m=+816.028912820" Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.836854 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-kmbww" podStartSLOduration=2.836847339 podStartE2EDuration="2.836847339s" podCreationTimestamp="2026-02-19 21:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:41:44.834013259 +0000 UTC m=+816.026531133" watchObservedRunningTime="2026-02-19 21:41:44.836847339 +0000 UTC m=+816.029365193" Feb 19 21:41:49 crc kubenswrapper[4795]: I0219 21:41:49.858487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"d73d829479256056c62fe2ac838de7fd175399aa9f5a3f8e7b6ce485e7ac517c"} Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.866876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" event={"ID":"e32c1521-9c29-4d70-b4bb-54af4127daaf","Type":"ContainerStarted","Data":"d04c81a2b1179a537195634390212769771129f49fec48d33bca709345343559"} Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.866975 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.869049 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="3c79ff86-c25b-45b2-9f84-d33c6264cc0a" containerID="d73d829479256056c62fe2ac838de7fd175399aa9f5a3f8e7b6ce485e7ac517c" exitCode=0 Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.869105 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c79ff86-c25b-45b2-9f84-d33c6264cc0a" containerID="72490e311d8a8f57f4ff25e7a8c8b4561c1442d47f363f83bb4f177d5c9b8a35" exitCode=0 Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.869143 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerDied","Data":"d73d829479256056c62fe2ac838de7fd175399aa9f5a3f8e7b6ce485e7ac517c"} Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.869213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerDied","Data":"72490e311d8a8f57f4ff25e7a8c8b4561c1442d47f363f83bb4f177d5c9b8a35"} Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.937752 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" podStartSLOduration=2.590851862 podStartE2EDuration="8.937729746s" podCreationTimestamp="2026-02-19 21:41:42 +0000 UTC" firstStartedPulling="2026-02-19 21:41:43.3435216 +0000 UTC m=+814.536039464" lastFinishedPulling="2026-02-19 21:41:49.690399474 +0000 UTC m=+820.882917348" observedRunningTime="2026-02-19 21:41:50.898394104 +0000 UTC m=+822.090911978" watchObservedRunningTime="2026-02-19 21:41:50.937729746 +0000 UTC m=+822.130247640" Feb 19 21:41:51 crc kubenswrapper[4795]: I0219 21:41:51.879995 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c79ff86-c25b-45b2-9f84-d33c6264cc0a" containerID="3b7cad17e35684cde452421d7ca4e7d24377b7d18f62c3a556b036e8654918a2" exitCode=0 Feb 19 21:41:51 crc kubenswrapper[4795]: I0219 21:41:51.880094 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerDied","Data":"3b7cad17e35684cde452421d7ca4e7d24377b7d18f62c3a556b036e8654918a2"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892386 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"ef29568524bb229975514ef0788a7abc9a9bde52e2f211668185fbab6225b011"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"63c8caad8691f8ca95be952545c23d51ef9eb1af270f0d0f7f162802ed5cd007"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892700 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"deeafe2b27fe4d72012e098dbfc492988ac9174c03f3cf05903e8c59f99be818"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"83b40f6c5992832cf7358444e78ce7f4c611860f9d7a246847e859075b139159"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"e7f43fbada166b61bac7c815dce8910ab401ca78b6f4dc559075e40b9f92f0f5"} Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.195229 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.767310 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-kmbww" Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.902473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"543e34a58bf0cba90d368232b34f75d0ecc48f7e93a76053516cff14262fb13d"} Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.902659 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.925982 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-b7csh" podStartSLOduration=5.521059606 podStartE2EDuration="11.925964847s" podCreationTimestamp="2026-02-19 21:41:42 +0000 UTC" firstStartedPulling="2026-02-19 21:41:43.268985011 +0000 UTC m=+814.461502875" lastFinishedPulling="2026-02-19 21:41:49.673890252 +0000 UTC m=+820.866408116" observedRunningTime="2026-02-19 21:41:53.923029535 +0000 UTC m=+825.115547419" watchObservedRunningTime="2026-02-19 21:41:53.925964847 +0000 UTC m=+825.118482721" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.148022 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6"] Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.149685 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.152297 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.167374 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6"] Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.199869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.199986 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.200019 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: 
I0219 21:41:55.301548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.301641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.301697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.302342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.302943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.329055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.470140 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.671492 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6"] Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.914802 4795 generic.go:334] "Generic (PLEG): container finished" podID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerID="28f884510467f3d0ccbca8dbc1657f80878d0035b8902e2a73fc0ce4be4db940" exitCode=0 Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.914841 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerDied","Data":"28f884510467f3d0ccbca8dbc1657f80878d0035b8902e2a73fc0ce4be4db940"} Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.914870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerStarted","Data":"1fc9a3d11f64d5cef1ac53a01fa7a06ddaf1046b3d7b78ab66ef1091c7d79560"} Feb 19 21:41:58 crc kubenswrapper[4795]: I0219 21:41:58.099091 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:58 crc kubenswrapper[4795]: I0219 21:41:58.149640 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:59 crc kubenswrapper[4795]: I0219 21:41:59.938976 4795 generic.go:334] "Generic (PLEG): container finished" podID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerID="648f73eb6a7a5b928f64dce45c3c576ce679739a989f57b6d95eaedcd3692b3e" exitCode=0 Feb 19 21:41:59 crc kubenswrapper[4795]: I0219 21:41:59.939035 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerDied","Data":"648f73eb6a7a5b928f64dce45c3c576ce679739a989f57b6d95eaedcd3692b3e"} Feb 19 21:42:00 crc kubenswrapper[4795]: I0219 21:42:00.951304 4795 generic.go:334] "Generic (PLEG): container finished" podID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerID="e3730bdcb19372ac8c0b9369d1d0af34244b2d30d4da4dfcb34d21e8405a0a67" exitCode=0 Feb 19 21:42:00 crc kubenswrapper[4795]: I0219 21:42:00.951581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerDied","Data":"e3730bdcb19372ac8c0b9369d1d0af34244b2d30d4da4dfcb34d21e8405a0a67"} Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.210635 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.392860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") pod \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.392988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") pod \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.393087 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") pod \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.394395 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle" (OuterVolumeSpecName: "bundle") pod "5aba51a3-e783-497f-b58e-dcd4e631b0e9" (UID: "5aba51a3-e783-497f-b58e-dcd4e631b0e9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.403464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f" (OuterVolumeSpecName: "kube-api-access-pj24f") pod "5aba51a3-e783-497f-b58e-dcd4e631b0e9" (UID: "5aba51a3-e783-497f-b58e-dcd4e631b0e9"). InnerVolumeSpecName "kube-api-access-pj24f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.406030 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util" (OuterVolumeSpecName: "util") pod "5aba51a3-e783-497f-b58e-dcd4e631b0e9" (UID: "5aba51a3-e783-497f-b58e-dcd4e631b0e9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.494742 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.494799 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.494826 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.965146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerDied","Data":"1fc9a3d11f64d5cef1ac53a01fa7a06ddaf1046b3d7b78ab66ef1091c7d79560"} Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.965214 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fc9a3d11f64d5cef1ac53a01fa7a06ddaf1046b3d7b78ab66ef1091c7d79560" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.965274 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:42:03 crc kubenswrapper[4795]: I0219 21:42:03.094849 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:42:03 crc kubenswrapper[4795]: I0219 21:42:03.101347 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.113102 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch"] Feb 19 21:42:07 crc kubenswrapper[4795]: E0219 21:42:07.114214 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="pull" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.114240 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="pull" Feb 19 21:42:07 crc kubenswrapper[4795]: E0219 21:42:07.114269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="util" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.114282 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="util" Feb 19 21:42:07 crc kubenswrapper[4795]: E0219 21:42:07.114305 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="extract" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.114320 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="extract" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.114526 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" 
containerName="extract" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.116990 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.121720 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.121871 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-wgj97" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.122951 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.126215 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch"] Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.260999 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a132514-cc3c-49a6-9a36-812490cf7ada-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.261076 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsm4x\" (UniqueName: \"kubernetes.io/projected/8a132514-cc3c-49a6-9a36-812490cf7ada-kube-api-access-gsm4x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc 
kubenswrapper[4795]: I0219 21:42:07.362504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a132514-cc3c-49a6-9a36-812490cf7ada-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.362621 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsm4x\" (UniqueName: \"kubernetes.io/projected/8a132514-cc3c-49a6-9a36-812490cf7ada-kube-api-access-gsm4x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.363158 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a132514-cc3c-49a6-9a36-812490cf7ada-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.399348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsm4x\" (UniqueName: \"kubernetes.io/projected/8a132514-cc3c-49a6-9a36-812490cf7ada-kube-api-access-gsm4x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.435661 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.761410 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch"] Feb 19 21:42:07 crc kubenswrapper[4795]: W0219 21:42:07.771992 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a132514_cc3c_49a6_9a36_812490cf7ada.slice/crio-905dd52b70620838ccf60fe3a72b39be64ba3db24af3a2f032b6174af096f37b WatchSource:0}: Error finding container 905dd52b70620838ccf60fe3a72b39be64ba3db24af3a2f032b6174af096f37b: Status 404 returned error can't find the container with id 905dd52b70620838ccf60fe3a72b39be64ba3db24af3a2f032b6174af096f37b Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.998791 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" event={"ID":"8a132514-cc3c-49a6-9a36-812490cf7ada","Type":"ContainerStarted","Data":"905dd52b70620838ccf60fe3a72b39be64ba3db24af3a2f032b6174af096f37b"} Feb 19 21:42:13 crc kubenswrapper[4795]: I0219 21:42:13.031646 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" event={"ID":"8a132514-cc3c-49a6-9a36-812490cf7ada","Type":"ContainerStarted","Data":"bef1c724315170992f3b10dbf8664ef1160980726f3f778c97e5120732fbade1"} Feb 19 21:42:13 crc kubenswrapper[4795]: I0219 21:42:13.061197 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" podStartSLOduration=1.757070524 podStartE2EDuration="6.061132057s" podCreationTimestamp="2026-02-19 21:42:07 +0000 UTC" firstStartedPulling="2026-02-19 21:42:07.776801557 +0000 UTC m=+838.969319411" 
lastFinishedPulling="2026-02-19 21:42:12.08086308 +0000 UTC m=+843.273380944" observedRunningTime="2026-02-19 21:42:13.058718199 +0000 UTC m=+844.251236083" watchObservedRunningTime="2026-02-19 21:42:13.061132057 +0000 UTC m=+844.253649951" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.422919 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bjb5c"] Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.424298 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.426549 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.426701 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.426824 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-m2pm6" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.430019 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bjb5c"] Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.593779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.593862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6gwv\" (UniqueName: 
\"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-kube-api-access-f6gwv\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.695650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.695714 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6gwv\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-kube-api-access-f6gwv\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.712747 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.712937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6gwv\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-kube-api-access-f6gwv\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.742078 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.151617 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n8skq"] Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.152952 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.154844 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qsv95" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.200328 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n8skq"] Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.282077 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bjb5c"] Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.303906 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.304014 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcwwh\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-kube-api-access-qcwwh\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.405820 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qcwwh\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-kube-api-access-qcwwh\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.406191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.425202 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.425288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcwwh\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-kube-api-access-qcwwh\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.506977 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.910569 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n8skq"] Feb 19 21:42:17 crc kubenswrapper[4795]: W0219 21:42:17.918411 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35b44919_239d_4fe8_8c53_a3698e24f753.slice/crio-29586266cd775b987356a94c8eff4700b242d0b9f8d081d48ec24544abdd234a WatchSource:0}: Error finding container 29586266cd775b987356a94c8eff4700b242d0b9f8d081d48ec24544abdd234a: Status 404 returned error can't find the container with id 29586266cd775b987356a94c8eff4700b242d0b9f8d081d48ec24544abdd234a Feb 19 21:42:18 crc kubenswrapper[4795]: I0219 21:42:18.067997 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" event={"ID":"35b44919-239d-4fe8-8c53-a3698e24f753","Type":"ContainerStarted","Data":"29586266cd775b987356a94c8eff4700b242d0b9f8d081d48ec24544abdd234a"} Feb 19 21:42:18 crc kubenswrapper[4795]: I0219 21:42:18.069036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" event={"ID":"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee","Type":"ContainerStarted","Data":"2d2c5567b726c1865f57e06bc4c3955b23f9bc923bee9aa2e2c7784c5df2a70a"} Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.101936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" event={"ID":"35b44919-239d-4fe8-8c53-a3698e24f753","Type":"ContainerStarted","Data":"a6a2727ab8180f0e056de89056ed5844515facd30e362c513559fa8ebd109e6b"} Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.103842 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" 
event={"ID":"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee","Type":"ContainerStarted","Data":"7282ad756d00d0ecffbc963eefb288f679126aadd6560469d94b13ab709abf92"} Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.103960 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.116620 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" podStartSLOduration=2.0653202840000002 podStartE2EDuration="6.116604622s" podCreationTimestamp="2026-02-19 21:42:17 +0000 UTC" firstStartedPulling="2026-02-19 21:42:17.919866508 +0000 UTC m=+849.112384372" lastFinishedPulling="2026-02-19 21:42:21.971150846 +0000 UTC m=+853.163668710" observedRunningTime="2026-02-19 21:42:23.116588862 +0000 UTC m=+854.309106726" watchObservedRunningTime="2026-02-19 21:42:23.116604622 +0000 UTC m=+854.309122486" Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.135738 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" podStartSLOduration=2.451333844 podStartE2EDuration="7.135723168s" podCreationTimestamp="2026-02-19 21:42:16 +0000 UTC" firstStartedPulling="2026-02-19 21:42:17.30025164 +0000 UTC m=+848.492769504" lastFinishedPulling="2026-02-19 21:42:21.984640964 +0000 UTC m=+853.177158828" observedRunningTime="2026-02-19 21:42:23.134111923 +0000 UTC m=+854.326629807" watchObservedRunningTime="2026-02-19 21:42:23.135723168 +0000 UTC m=+854.328241032" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.101260 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-jdbs7"] Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.104824 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.106634 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2zdjc" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.106831 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jdbs7"] Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.236893 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-bound-sa-token\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.237009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2pbw\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-kube-api-access-j2pbw\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.337993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-bound-sa-token\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.338065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2pbw\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-kube-api-access-j2pbw\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: 
\"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.357874 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-bound-sa-token\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.362261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2pbw\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-kube-api-access-j2pbw\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.428260 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: W0219 21:42:26.872463 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1df7da5_3926_430a_8085_202bccbc4d73.slice/crio-702f343b7bad680159b3a0adda297d9671b00065b65f3e76618d7dfcdf9c518c WatchSource:0}: Error finding container 702f343b7bad680159b3a0adda297d9671b00065b65f3e76618d7dfcdf9c518c: Status 404 returned error can't find the container with id 702f343b7bad680159b3a0adda297d9671b00065b65f3e76618d7dfcdf9c518c Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.874249 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jdbs7"] Feb 19 21:42:27 crc kubenswrapper[4795]: I0219 21:42:27.125975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jdbs7" 
event={"ID":"c1df7da5-3926-430a-8085-202bccbc4d73","Type":"ContainerStarted","Data":"eb53a395a459fb62ae7aa8811ebccc7350481995d3e45eeb9820a24fb4916f59"} Feb 19 21:42:27 crc kubenswrapper[4795]: I0219 21:42:27.126334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jdbs7" event={"ID":"c1df7da5-3926-430a-8085-202bccbc4d73","Type":"ContainerStarted","Data":"702f343b7bad680159b3a0adda297d9671b00065b65f3e76618d7dfcdf9c518c"} Feb 19 21:42:27 crc kubenswrapper[4795]: I0219 21:42:27.144801 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-jdbs7" podStartSLOduration=1.144784013 podStartE2EDuration="1.144784013s" podCreationTimestamp="2026-02-19 21:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:42:27.140656747 +0000 UTC m=+858.333174611" watchObservedRunningTime="2026-02-19 21:42:27.144784013 +0000 UTC m=+858.337301877" Feb 19 21:42:31 crc kubenswrapper[4795]: I0219 21:42:31.745780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.262345 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.263348 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.265758 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.265994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-tr24d" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.266203 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.283747 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.361120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8pjb\" (UniqueName: \"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") pod \"openstack-operator-index-z5tz7\" (UID: \"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\") " pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.462676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8pjb\" (UniqueName: \"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") pod \"openstack-operator-index-z5tz7\" (UID: \"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\") " pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.497908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8pjb\" (UniqueName: \"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") pod \"openstack-operator-index-z5tz7\" (UID: 
\"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\") " pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.598669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:36 crc kubenswrapper[4795]: I0219 21:42:36.039995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:36 crc kubenswrapper[4795]: W0219 21:42:36.043876 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eeae4db_e5c7_4179_9040_91ebfbc5d48a.slice/crio-4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11 WatchSource:0}: Error finding container 4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11: Status 404 returned error can't find the container with id 4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11 Feb 19 21:42:36 crc kubenswrapper[4795]: I0219 21:42:36.180111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z5tz7" event={"ID":"4eeae4db-e5c7-4179-9040-91ebfbc5d48a","Type":"ContainerStarted","Data":"4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11"} Feb 19 21:42:37 crc kubenswrapper[4795]: I0219 21:42:37.188557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z5tz7" event={"ID":"4eeae4db-e5c7-4179-9040-91ebfbc5d48a","Type":"ContainerStarted","Data":"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab"} Feb 19 21:42:37 crc kubenswrapper[4795]: I0219 21:42:37.209428 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-z5tz7" podStartSLOduration=1.4712887559999999 podStartE2EDuration="2.209396814s" podCreationTimestamp="2026-02-19 21:42:35 +0000 UTC" 
firstStartedPulling="2026-02-19 21:42:36.046234902 +0000 UTC m=+867.238752766" lastFinishedPulling="2026-02-19 21:42:36.78434296 +0000 UTC m=+867.976860824" observedRunningTime="2026-02-19 21:42:37.202778849 +0000 UTC m=+868.395296743" watchObservedRunningTime="2026-02-19 21:42:37.209396814 +0000 UTC m=+868.401914718" Feb 19 21:42:38 crc kubenswrapper[4795]: I0219 21:42:38.437613 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.046143 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tf75g"] Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.046987 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.058924 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tf75g"] Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.200190 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-z5tz7" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerName="registry-server" containerID="cri-o://0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" gracePeriod=2 Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.212244 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhts\" (UniqueName: \"kubernetes.io/projected/91c93ffc-fbe2-486e-92a9-ca5737dc7875-kube-api-access-gfhts\") pod \"openstack-operator-index-tf75g\" (UID: \"91c93ffc-fbe2-486e-92a9-ca5737dc7875\") " pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.313745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gfhts\" (UniqueName: \"kubernetes.io/projected/91c93ffc-fbe2-486e-92a9-ca5737dc7875-kube-api-access-gfhts\") pod \"openstack-operator-index-tf75g\" (UID: \"91c93ffc-fbe2-486e-92a9-ca5737dc7875\") " pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.334308 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhts\" (UniqueName: \"kubernetes.io/projected/91c93ffc-fbe2-486e-92a9-ca5737dc7875-kube-api-access-gfhts\") pod \"openstack-operator-index-tf75g\" (UID: \"91c93ffc-fbe2-486e-92a9-ca5737dc7875\") " pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.367602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.565966 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.598601 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tf75g"] Feb 19 21:42:39 crc kubenswrapper[4795]: W0219 21:42:39.606305 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c93ffc_fbe2_486e_92a9_ca5737dc7875.slice/crio-9a27a550b86db7057ede89e67f90298f9c043451704d647ab661cf98355e4a0c WatchSource:0}: Error finding container 9a27a550b86db7057ede89e67f90298f9c043451704d647ab661cf98355e4a0c: Status 404 returned error can't find the container with id 9a27a550b86db7057ede89e67f90298f9c043451704d647ab661cf98355e4a0c Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.725007 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8pjb\" (UniqueName: 
\"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") pod \"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\" (UID: \"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\") " Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.729451 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb" (OuterVolumeSpecName: "kube-api-access-q8pjb") pod "4eeae4db-e5c7-4179-9040-91ebfbc5d48a" (UID: "4eeae4db-e5c7-4179-9040-91ebfbc5d48a"). InnerVolumeSpecName "kube-api-access-q8pjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.826785 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8pjb\" (UniqueName: \"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.206229 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tf75g" event={"ID":"91c93ffc-fbe2-486e-92a9-ca5737dc7875","Type":"ContainerStarted","Data":"90e6c28c48f3f75fa44a5da70c3c91644365c580e2b9079af48fa261adedb09b"} Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.206651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tf75g" event={"ID":"91c93ffc-fbe2-486e-92a9-ca5737dc7875","Type":"ContainerStarted","Data":"9a27a550b86db7057ede89e67f90298f9c043451704d647ab661cf98355e4a0c"} Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207612 4795 generic.go:334] "Generic (PLEG): container finished" podID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerID="0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" exitCode=0 Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-z5tz7" event={"ID":"4eeae4db-e5c7-4179-9040-91ebfbc5d48a","Type":"ContainerDied","Data":"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab"} Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207669 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z5tz7" event={"ID":"4eeae4db-e5c7-4179-9040-91ebfbc5d48a","Type":"ContainerDied","Data":"4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11"} Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207683 4795 scope.go:117] "RemoveContainer" containerID="0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207770 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.227097 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tf75g" podStartSLOduration=0.835997599 podStartE2EDuration="1.227073731s" podCreationTimestamp="2026-02-19 21:42:39 +0000 UTC" firstStartedPulling="2026-02-19 21:42:39.610388625 +0000 UTC m=+870.802906489" lastFinishedPulling="2026-02-19 21:42:40.001464757 +0000 UTC m=+871.193982621" observedRunningTime="2026-02-19 21:42:40.219044706 +0000 UTC m=+871.411562570" watchObservedRunningTime="2026-02-19 21:42:40.227073731 +0000 UTC m=+871.419591595" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.239828 4795 scope.go:117] "RemoveContainer" containerID="0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" Feb 19 21:42:40 crc kubenswrapper[4795]: E0219 21:42:40.244138 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab\": container with 
ID starting with 0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab not found: ID does not exist" containerID="0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.244222 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab"} err="failed to get container status \"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab\": rpc error: code = NotFound desc = could not find container \"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab\": container with ID starting with 0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab not found: ID does not exist" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.251549 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.257069 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:41 crc kubenswrapper[4795]: I0219 21:42:41.518917 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" path="/var/lib/kubelet/pods/4eeae4db-e5c7-4179-9040-91ebfbc5d48a/volumes" Feb 19 21:42:49 crc kubenswrapper[4795]: I0219 21:42:49.368633 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:49 crc kubenswrapper[4795]: I0219 21:42:49.369419 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:49 crc kubenswrapper[4795]: I0219 21:42:49.405678 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:50 crc 
kubenswrapper[4795]: I0219 21:42:50.294985 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.303598 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4"] Feb 19 21:42:56 crc kubenswrapper[4795]: E0219 21:42:56.304449 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerName="registry-server" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.304464 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerName="registry-server" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.304632 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerName="registry-server" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.305619 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.319843 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ldffb" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.328168 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4"] Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.463145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.463210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.463236 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 
21:42:56.563991 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.564327 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.564453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.564721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.564763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.586496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.690072 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.856865 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4"] Feb 19 21:42:56 crc kubenswrapper[4795]: W0219 21:42:56.864875 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6afab948_ae77_464b_aa33_b8d45ddc01ff.slice/crio-124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7 WatchSource:0}: Error finding container 124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7: Status 404 returned error can't find the container with id 124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7 Feb 19 21:42:57 crc kubenswrapper[4795]: I0219 21:42:57.317806 4795 generic.go:334] "Generic (PLEG): container finished" podID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerID="560fa89ed64d4fb8a3f7191bc9e26dfc07616a4bde2cd7a679b6b0b56a68cd4f" exitCode=0 Feb 19 
21:42:57 crc kubenswrapper[4795]: I0219 21:42:57.317897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerDied","Data":"560fa89ed64d4fb8a3f7191bc9e26dfc07616a4bde2cd7a679b6b0b56a68cd4f"} Feb 19 21:42:57 crc kubenswrapper[4795]: I0219 21:42:57.318055 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerStarted","Data":"124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7"} Feb 19 21:42:58 crc kubenswrapper[4795]: I0219 21:42:58.336199 4795 generic.go:334] "Generic (PLEG): container finished" podID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerID="bd54c9335ce2f7069bfa894ae6efb0cce1568572f89ec25fa875d2cd76665ac2" exitCode=0 Feb 19 21:42:58 crc kubenswrapper[4795]: I0219 21:42:58.336298 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerDied","Data":"bd54c9335ce2f7069bfa894ae6efb0cce1568572f89ec25fa875d2cd76665ac2"} Feb 19 21:42:58 crc kubenswrapper[4795]: I0219 21:42:58.428631 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:42:58 crc kubenswrapper[4795]: I0219 21:42:58.428699 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:42:59 crc kubenswrapper[4795]: I0219 21:42:59.345683 4795 generic.go:334] "Generic (PLEG): container finished" podID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerID="5a8e78242dbb62fb0a41bc624cecf9454d7d079f6f6576ad6349df0d475f901d" exitCode=0 Feb 19 21:42:59 crc kubenswrapper[4795]: I0219 21:42:59.345728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerDied","Data":"5a8e78242dbb62fb0a41bc624cecf9454d7d079f6f6576ad6349df0d475f901d"} Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.720244 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.822069 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") pod \"6afab948-ae77-464b-aa33-b8d45ddc01ff\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.822132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") pod \"6afab948-ae77-464b-aa33-b8d45ddc01ff\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.822214 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") pod \"6afab948-ae77-464b-aa33-b8d45ddc01ff\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " Feb 19 21:43:00 crc 
kubenswrapper[4795]: I0219 21:43:00.823478 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle" (OuterVolumeSpecName: "bundle") pod "6afab948-ae77-464b-aa33-b8d45ddc01ff" (UID: "6afab948-ae77-464b-aa33-b8d45ddc01ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.826976 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d" (OuterVolumeSpecName: "kube-api-access-mdd2d") pod "6afab948-ae77-464b-aa33-b8d45ddc01ff" (UID: "6afab948-ae77-464b-aa33-b8d45ddc01ff"). InnerVolumeSpecName "kube-api-access-mdd2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.840462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util" (OuterVolumeSpecName: "util") pod "6afab948-ae77-464b-aa33-b8d45ddc01ff" (UID: "6afab948-ae77-464b-aa33-b8d45ddc01ff"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.924057 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.924366 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.924452 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:01 crc kubenswrapper[4795]: I0219 21:43:01.367379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerDied","Data":"124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7"} Feb 19 21:43:01 crc kubenswrapper[4795]: I0219 21:43:01.367417 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7" Feb 19 21:43:01 crc kubenswrapper[4795]: I0219 21:43:01.368141 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001062 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq"] Feb 19 21:43:04 crc kubenswrapper[4795]: E0219 21:43:04.001681 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="extract" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001697 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="extract" Feb 19 21:43:04 crc kubenswrapper[4795]: E0219 21:43:04.001720 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="pull" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001727 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="pull" Feb 19 21:43:04 crc kubenswrapper[4795]: E0219 21:43:04.001746 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="util" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001754 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="util" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001884 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="extract" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.002411 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.004645 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-l2pgb" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.021630 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq"] Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.063103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9mb\" (UniqueName: \"kubernetes.io/projected/c6c44d2f-3e8f-42de-babe-85a8fc1a97ec-kube-api-access-sr9mb\") pod \"openstack-operator-controller-init-6679bf9b57-vbjcq\" (UID: \"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.164708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9mb\" (UniqueName: \"kubernetes.io/projected/c6c44d2f-3e8f-42de-babe-85a8fc1a97ec-kube-api-access-sr9mb\") pod \"openstack-operator-controller-init-6679bf9b57-vbjcq\" (UID: \"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.181735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9mb\" (UniqueName: \"kubernetes.io/projected/c6c44d2f-3e8f-42de-babe-85a8fc1a97ec-kube-api-access-sr9mb\") pod \"openstack-operator-controller-init-6679bf9b57-vbjcq\" (UID: \"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.317378 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.565093 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq"] Feb 19 21:43:05 crc kubenswrapper[4795]: I0219 21:43:05.394571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" event={"ID":"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec","Type":"ContainerStarted","Data":"975d1aa22e4e6bad437d6e96741fac9388db1f4391542e732131ee7f49cfc185"} Feb 19 21:43:09 crc kubenswrapper[4795]: I0219 21:43:09.430902 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" event={"ID":"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec","Type":"ContainerStarted","Data":"91ce1c70e01e501ae90013294415d0b3a65ffe27bcbf989054ebea39c295ed52"} Feb 19 21:43:09 crc kubenswrapper[4795]: I0219 21:43:09.431531 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:09 crc kubenswrapper[4795]: I0219 21:43:09.469208 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" podStartSLOduration=2.638696849 podStartE2EDuration="6.469188668s" podCreationTimestamp="2026-02-19 21:43:03 +0000 UTC" firstStartedPulling="2026-02-19 21:43:04.575232891 +0000 UTC m=+895.767750755" lastFinishedPulling="2026-02-19 21:43:08.40572471 +0000 UTC m=+899.598242574" observedRunningTime="2026-02-19 21:43:09.464647161 +0000 UTC m=+900.657165055" watchObservedRunningTime="2026-02-19 21:43:09.469188668 +0000 UTC m=+900.661706542" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.528477 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-84vzh"]
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.530056 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.555336 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84vzh"]
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.585797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.586203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.586300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.687816 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.688079 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.688245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.688407 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.688574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.708322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.864639 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.102465 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84vzh"]
Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.319487 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq"
Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.464302 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerID="b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737" exitCode=0
Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.464356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerDied","Data":"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737"}
Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.464408 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerStarted","Data":"5e0099795ccfe2d93b8fa7fc956a19af072f61fe6c2e7aef96555ee34be37e44"}
Feb 19 21:43:15 crc kubenswrapper[4795]: I0219 21:43:15.472691 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerID="55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f" exitCode=0
Feb 19 21:43:15 crc kubenswrapper[4795]: I0219 21:43:15.472774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerDied","Data":"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f"}
Feb 19 21:43:16 crc kubenswrapper[4795]: I0219 21:43:16.480521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerStarted","Data":"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487"}
Feb 19 21:43:16 crc kubenswrapper[4795]: I0219 21:43:16.508273 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84vzh" podStartSLOduration=2.153926838 podStartE2EDuration="3.50824638s" podCreationTimestamp="2026-02-19 21:43:13 +0000 UTC" firstStartedPulling="2026-02-19 21:43:14.465640369 +0000 UTC m=+905.658158233" lastFinishedPulling="2026-02-19 21:43:15.819959911 +0000 UTC m=+907.012477775" observedRunningTime="2026-02-19 21:43:16.50081186 +0000 UTC m=+907.693329734" watchObservedRunningTime="2026-02-19 21:43:16.50824638 +0000 UTC m=+907.700764284"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.342921 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-df67c"]
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.345055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.373110 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df67c"]
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.464309 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.464361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.464382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.565404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.565468 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.565492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.567151 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.567390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.583751 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.668004 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:20 crc kubenswrapper[4795]: I0219 21:43:20.082536 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df67c"]
Feb 19 21:43:20 crc kubenswrapper[4795]: E0219 21:43:20.332574 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26c9b930_41fe_4332_9fde_8c9d4cb304bd.slice/crio-aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 21:43:20 crc kubenswrapper[4795]: I0219 21:43:20.506688 4795 generic.go:334] "Generic (PLEG): container finished" podID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerID="aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61" exitCode=0
Feb 19 21:43:20 crc kubenswrapper[4795]: I0219 21:43:20.506749 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerDied","Data":"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61"}
Feb 19 21:43:20 crc kubenswrapper[4795]: I0219 21:43:20.506815 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerStarted","Data":"8e2862dfbf3b5bd22726e0781fe62e4bc152e6dd0506beef8a94a47634a3f65a"}
Feb 19 21:43:21 crc kubenswrapper[4795]: I0219 21:43:21.521064 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerStarted","Data":"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8"}
Feb 19 21:43:22 crc kubenswrapper[4795]: I0219 21:43:22.523817 4795 generic.go:334] "Generic (PLEG): container finished" podID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerID="673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8" exitCode=0
Feb 19 21:43:22 crc kubenswrapper[4795]: I0219 21:43:22.523871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerDied","Data":"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8"}
Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.530767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerStarted","Data":"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d"}
Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.866004 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.866069 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.915701 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.932978 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-df67c" podStartSLOduration=2.4934207 podStartE2EDuration="4.932964063s" podCreationTimestamp="2026-02-19 21:43:19 +0000 UTC" firstStartedPulling="2026-02-19 21:43:20.508081929 +0000 UTC m=+911.700599793" lastFinishedPulling="2026-02-19 21:43:22.947625292 +0000 UTC m=+914.140143156" observedRunningTime="2026-02-19 21:43:23.554506089 +0000 UTC m=+914.747023963" watchObservedRunningTime="2026-02-19 21:43:23.932964063 +0000 UTC m=+915.125481927"
Feb 19 21:43:24 crc kubenswrapper[4795]: I0219 21:43:24.616050 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:26 crc kubenswrapper[4795]: I0219 21:43:26.329352 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84vzh"]
Feb 19 21:43:26 crc kubenswrapper[4795]: I0219 21:43:26.547981 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84vzh" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="registry-server" containerID="cri-o://5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487" gracePeriod=2
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.490889 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554834 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerID="5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487" exitCode=0
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerDied","Data":"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487"}
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554894 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vzh"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554908 4795 scope.go:117] "RemoveContainer" containerID="5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerDied","Data":"5e0099795ccfe2d93b8fa7fc956a19af072f61fe6c2e7aef96555ee34be37e44"}
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.566321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") pod \"ceeabb16-8075-4c75-8d79-b49b92451b81\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") "
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.566412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") pod \"ceeabb16-8075-4c75-8d79-b49b92451b81\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") "
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.566464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") pod \"ceeabb16-8075-4c75-8d79-b49b92451b81\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") "
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.567850 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities" (OuterVolumeSpecName: "utilities") pod "ceeabb16-8075-4c75-8d79-b49b92451b81" (UID: "ceeabb16-8075-4c75-8d79-b49b92451b81"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.568972 4795 scope.go:117] "RemoveContainer" containerID="55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.584531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595" (OuterVolumeSpecName: "kube-api-access-d6595") pod "ceeabb16-8075-4c75-8d79-b49b92451b81" (UID: "ceeabb16-8075-4c75-8d79-b49b92451b81"). InnerVolumeSpecName "kube-api-access-d6595". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.603024 4795 scope.go:117] "RemoveContainer" containerID="b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.620986 4795 scope.go:117] "RemoveContainer" containerID="5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487"
Feb 19 21:43:27 crc kubenswrapper[4795]: E0219 21:43:27.621439 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487\": container with ID starting with 5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487 not found: ID does not exist" containerID="5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.621477 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487"} err="failed to get container status \"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487\": rpc error: code = NotFound desc = could not find container \"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487\": container with ID starting with 5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487 not found: ID does not exist"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.621499 4795 scope.go:117] "RemoveContainer" containerID="55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f"
Feb 19 21:43:27 crc kubenswrapper[4795]: E0219 21:43:27.621897 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f\": container with ID starting with 55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f not found: ID does not exist" containerID="55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.621938 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f"} err="failed to get container status \"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f\": rpc error: code = NotFound desc = could not find container \"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f\": container with ID starting with 55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f not found: ID does not exist"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.621966 4795 scope.go:117] "RemoveContainer" containerID="b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737"
Feb 19 21:43:27 crc kubenswrapper[4795]: E0219 21:43:27.622323 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737\": container with ID starting with b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737 not found: ID does not exist" containerID="b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.622369 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737"} err="failed to get container status \"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737\": rpc error: code = NotFound desc = could not find container \"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737\": container with ID starting with b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737 not found: ID does not exist"
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.623732 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceeabb16-8075-4c75-8d79-b49b92451b81" (UID: "ceeabb16-8075-4c75-8d79-b49b92451b81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.668230 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") on node \"crc\" DevicePath \"\""
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.668262 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.668271 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.877949 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84vzh"]
Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.882228 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84vzh"]
Feb 19 21:43:28 crc kubenswrapper[4795]: I0219 21:43:28.427512 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:43:28 crc kubenswrapper[4795]: I0219 21:43:28.427570 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:43:29 crc kubenswrapper[4795]: I0219 21:43:29.519035 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" path="/var/lib/kubelet/pods/ceeabb16-8075-4c75-8d79-b49b92451b81/volumes"
Feb 19 21:43:29 crc kubenswrapper[4795]: I0219 21:43:29.668298 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:29 crc kubenswrapper[4795]: I0219 21:43:29.668416 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:29 crc kubenswrapper[4795]: I0219 21:43:29.711689 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:30 crc kubenswrapper[4795]: I0219 21:43:30.637073 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:31 crc kubenswrapper[4795]: I0219 21:43:31.732487 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df67c"]
Feb 19 21:43:33 crc kubenswrapper[4795]: I0219 21:43:33.586875 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-df67c" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="registry-server" containerID="cri-o://3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d" gracePeriod=2
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.000067 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.051278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") pod \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") "
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.051471 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") pod \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") "
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.051520 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") pod \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") "
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.052758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities" (OuterVolumeSpecName: "utilities") pod "26c9b930-41fe-4332-9fde-8c9d4cb304bd" (UID: "26c9b930-41fe-4332-9fde-8c9d4cb304bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.056794 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g" (OuterVolumeSpecName: "kube-api-access-hfg7g") pod "26c9b930-41fe-4332-9fde-8c9d4cb304bd" (UID: "26c9b930-41fe-4332-9fde-8c9d4cb304bd"). InnerVolumeSpecName "kube-api-access-hfg7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.124757 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm"]
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125120 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="extract-content"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125143 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="extract-content"
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125158 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="extract-utilities"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125183 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="extract-utilities"
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125206 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="extract-content"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125214 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="extract-content"
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125225 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="registry-server"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125232 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="registry-server"
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125245 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="registry-server"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125254 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="registry-server"
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="extract-utilities"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125278 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="extract-utilities"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125415 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="registry-server"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125441 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="registry-server"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.126049 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.128705 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-45p4m"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.131511 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.134462 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.136545 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-2t8tm"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.137575 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.147496 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.148488 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.150231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26c9b930-41fe-4332-9fde-8c9d4cb304bd" (UID: "26c9b930-41fe-4332-9fde-8c9d4cb304bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.151683 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-jns27"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.152407 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.153308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxhq\" (UniqueName: \"kubernetes.io/projected/4cc5be3d-87d8-46a4-ba7d-d95143c11857-kube-api-access-djxhq\") pod \"barbican-operator-controller-manager-868647ff47-z7hnk\" (UID: \"4cc5be3d-87d8-46a4-ba7d-d95143c11857\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.153387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk5sp\" (UniqueName: \"kubernetes.io/projected/54a55994-69ff-48f1-8d75-24b2a828cdc9-kube-api-access-dk5sp\") pod \"cinder-operator-controller-manager-5d946d989d-vwgdm\" (UID: \"54a55994-69ff-48f1-8d75-24b2a828cdc9\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.153482 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.153498 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") on node \"crc\" DevicePath \"\""
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.153509 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.161588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.166719 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.167396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.169401 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-zz2j7"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.199305 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.200708 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.202443 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xjfrp"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.209284 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.222077 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.264440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk5sp\" (UniqueName: \"kubernetes.io/projected/54a55994-69ff-48f1-8d75-24b2a828cdc9-kube-api-access-dk5sp\") pod \"cinder-operator-controller-manager-5d946d989d-vwgdm\" (UID: \"54a55994-69ff-48f1-8d75-24b2a828cdc9\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.264529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqf27\" (UniqueName: \"kubernetes.io/projected/268c2664-09cc-4616-9280-0dd6ae4159dc-kube-api-access-nqf27\") pod \"heat-operator-controller-manager-69f49c598c-shb4d\" (UID: \"268c2664-09cc-4616-9280-0dd6ae4159dc\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.265505 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpt88\" (UniqueName: \"kubernetes.io/projected/d19ed31e-e599-40ec-935d-d1d404e4c7a5-kube-api-access-qpt88\") pod \"glance-operator-controller-manager-77987464f4-5cnjr\" (UID: 
\"d19ed31e-e599-40ec-935d-d1d404e4c7a5\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.265572 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvbwc\" (UniqueName: \"kubernetes.io/projected/5c867f91-2ab2-43ce-8291-6d01825610d1-kube-api-access-kvbwc\") pod \"designate-operator-controller-manager-6d8bf5c495-d8wqs\" (UID: \"5c867f91-2ab2-43ce-8291-6d01825610d1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.265648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxhq\" (UniqueName: \"kubernetes.io/projected/4cc5be3d-87d8-46a4-ba7d-d95143c11857-kube-api-access-djxhq\") pod \"barbican-operator-controller-manager-868647ff47-z7hnk\" (UID: \"4cc5be3d-87d8-46a4-ba7d-d95143c11857\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.285951 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.287149 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.289154 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4lm4w" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.291941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxhq\" (UniqueName: \"kubernetes.io/projected/4cc5be3d-87d8-46a4-ba7d-d95143c11857-kube-api-access-djxhq\") pod \"barbican-operator-controller-manager-868647ff47-z7hnk\" (UID: \"4cc5be3d-87d8-46a4-ba7d-d95143c11857\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.297026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk5sp\" (UniqueName: \"kubernetes.io/projected/54a55994-69ff-48f1-8d75-24b2a828cdc9-kube-api-access-dk5sp\") pod \"cinder-operator-controller-manager-5d946d989d-vwgdm\" (UID: \"54a55994-69ff-48f1-8d75-24b2a828cdc9\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.299556 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.300433 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.305548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.305896 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-qzkcm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.313224 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.343897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.345250 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.347091 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mwbs5" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.366836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqf27\" (UniqueName: \"kubernetes.io/projected/268c2664-09cc-4616-9280-0dd6ae4159dc-kube-api-access-nqf27\") pod \"heat-operator-controller-manager-69f49c598c-shb4d\" (UID: \"268c2664-09cc-4616-9280-0dd6ae4159dc\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.366916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpt88\" (UniqueName: 
\"kubernetes.io/projected/d19ed31e-e599-40ec-935d-d1d404e4c7a5-kube-api-access-qpt88\") pod \"glance-operator-controller-manager-77987464f4-5cnjr\" (UID: \"d19ed31e-e599-40ec-935d-d1d404e4c7a5\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.366956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvbwc\" (UniqueName: \"kubernetes.io/projected/5c867f91-2ab2-43ce-8291-6d01825610d1-kube-api-access-kvbwc\") pod \"designate-operator-controller-manager-6d8bf5c495-d8wqs\" (UID: \"5c867f91-2ab2-43ce-8291-6d01825610d1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.366998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2qwr\" (UniqueName: \"kubernetes.io/projected/e37494c1-8780-4612-8569-fada28f0e772-kube-api-access-m2qwr\") pod \"horizon-operator-controller-manager-5b9b8895d5-fdd85\" (UID: \"e37494c1-8780-4612-8569-fada28f0e772\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.367027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.367058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvtht\" (UniqueName: \"kubernetes.io/projected/1d6085d5-f9db-4129-8662-b3ae045decfc-kube-api-access-vvtht\") pod 
\"manila-operator-controller-manager-54f6768c69-7n98g\" (UID: \"1d6085d5-f9db-4129-8662-b3ae045decfc\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.367089 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgwmv\" (UniqueName: \"kubernetes.io/projected/2e80963b-888b-4bb9-9259-864e38dd10ed-kube-api-access-qgwmv\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.367736 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.371192 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.375802 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-tv49m" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.396585 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.399776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpt88\" (UniqueName: \"kubernetes.io/projected/d19ed31e-e599-40ec-935d-d1d404e4c7a5-kube-api-access-qpt88\") pod \"glance-operator-controller-manager-77987464f4-5cnjr\" (UID: \"d19ed31e-e599-40ec-935d-d1d404e4c7a5\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.399925 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvbwc\" (UniqueName: \"kubernetes.io/projected/5c867f91-2ab2-43ce-8291-6d01825610d1-kube-api-access-kvbwc\") pod \"designate-operator-controller-manager-6d8bf5c495-d8wqs\" (UID: \"5c867f91-2ab2-43ce-8291-6d01825610d1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.406554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqf27\" (UniqueName: \"kubernetes.io/projected/268c2664-09cc-4616-9280-0dd6ae4159dc-kube-api-access-nqf27\") pod \"heat-operator-controller-manager-69f49c598c-shb4d\" (UID: \"268c2664-09cc-4616-9280-0dd6ae4159dc\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.414778 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.415686 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.420859 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-gdbsq" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.431002 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.443253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.449751 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.459413 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.467834 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.468137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.468987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2qwr\" (UniqueName: \"kubernetes.io/projected/e37494c1-8780-4612-8569-fada28f0e772-kube-api-access-m2qwr\") pod \"horizon-operator-controller-manager-5b9b8895d5-fdd85\" (UID: \"e37494c1-8780-4612-8569-fada28f0e772\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.469027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.469061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvtht\" (UniqueName: \"kubernetes.io/projected/1d6085d5-f9db-4129-8662-b3ae045decfc-kube-api-access-vvtht\") pod \"manila-operator-controller-manager-54f6768c69-7n98g\" (UID: \"1d6085d5-f9db-4129-8662-b3ae045decfc\") 
" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.469086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgwmv\" (UniqueName: \"kubernetes.io/projected/2e80963b-888b-4bb9-9259-864e38dd10ed-kube-api-access-qgwmv\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.469382 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.469469 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. No retries permitted until 2026-02-19 21:43:34.969446932 +0000 UTC m=+926.161964796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.482519 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.482847 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.483350 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.487159 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.488376 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.488388 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-s48hw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.488098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2qwr\" (UniqueName: \"kubernetes.io/projected/e37494c1-8780-4612-8569-fada28f0e772-kube-api-access-m2qwr\") pod \"horizon-operator-controller-manager-5b9b8895d5-fdd85\" (UID: \"e37494c1-8780-4612-8569-fada28f0e772\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.490070 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qzpvp" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.491544 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.493726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvtht\" (UniqueName: \"kubernetes.io/projected/1d6085d5-f9db-4129-8662-b3ae045decfc-kube-api-access-vvtht\") pod \"manila-operator-controller-manager-54f6768c69-7n98g\" (UID: \"1d6085d5-f9db-4129-8662-b3ae045decfc\") " 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.494043 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgwmv\" (UniqueName: \"kubernetes.io/projected/2e80963b-888b-4bb9-9259-864e38dd10ed-kube-api-access-qgwmv\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.495745 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.498589 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.500714 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-gvclq" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.503599 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.513338 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.514110 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.514570 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.517826 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jbnsb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.534292 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.538746 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.553899 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.557057 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.558970 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.559157 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5mql4" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.570549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztntl\" (UniqueName: \"kubernetes.io/projected/02592cbe-e1d4-4b62-8795-a204d5335594-kube-api-access-ztntl\") pod \"ironic-operator-controller-manager-554564d7fc-t87bb\" (UID: \"02592cbe-e1d4-4b62-8795-a204d5335594\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.570985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zthv5\" (UniqueName: \"kubernetes.io/projected/c7e19956-a3fb-4ed2-bc2a-72084ed62ac2-kube-api-access-zthv5\") pod \"keystone-operator-controller-manager-b4d948c87-t6hpt\" (UID: \"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.576734 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.577471 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.580778 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lg9gz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.587721 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.588602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.592207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-thwpb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.609906 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.613562 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.614311 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.628030 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.628373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-t4w6j" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.628850 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.649811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.651563 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.679952 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.680046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-727pb\" (UniqueName: \"kubernetes.io/projected/0bdb1789-27ad-4535-86d3-fd2fb7cebba2-kube-api-access-727pb\") pod \"mariadb-operator-controller-manager-6994f66f48-mr8mh\" (UID: \"0bdb1789-27ad-4535-86d3-fd2fb7cebba2\") " 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.680089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zthv5\" (UniqueName: \"kubernetes.io/projected/c7e19956-a3fb-4ed2-bc2a-72084ed62ac2-kube-api-access-zthv5\") pod \"keystone-operator-controller-manager-b4d948c87-t6hpt\" (UID: \"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.680139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gh7\" (UniqueName: \"kubernetes.io/projected/c2c4435e-a135-4c1f-bad4-121458c09bc3-kube-api-access-b5gh7\") pod \"neutron-operator-controller-manager-64ddbf8bb-4cf2p\" (UID: \"c2c4435e-a135-4c1f-bad4-121458c09bc3\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.681841 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.682387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztntl\" (UniqueName: \"kubernetes.io/projected/02592cbe-e1d4-4b62-8795-a204d5335594-kube-api-access-ztntl\") pod \"ironic-operator-controller-manager-554564d7fc-t87bb\" (UID: \"02592cbe-e1d4-4b62-8795-a204d5335594\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.682456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f88l7\" (UniqueName: \"kubernetes.io/projected/26db9cb2-1ed4-44e4-afac-404ce0f7d445-kube-api-access-f88l7\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.682495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls2qm\" (UniqueName: \"kubernetes.io/projected/b22b5096-41cf-40c9-94f6-8e546ca96a96-kube-api-access-ls2qm\") pod \"octavia-operator-controller-manager-69f8888797-ckxlw\" (UID: \"b22b5096-41cf-40c9-94f6-8e546ca96a96\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.682539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2xr\" (UniqueName: \"kubernetes.io/projected/5a6d3cc3-7e00-4013-b568-c2b835d8e2b9-kube-api-access-kh2xr\") pod \"nova-operator-controller-manager-567668f5cf-5b89b\" (UID: \"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.688380 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.693298 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695552 4795 generic.go:334] "Generic (PLEG): container finished" podID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerID="3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d" exitCode=0
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695599 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerDied","Data":"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d"}
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695627 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerDied","Data":"8e2862dfbf3b5bd22726e0781fe62e4bc152e6dd0506beef8a94a47634a3f65a"}
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695648 4795 scope.go:117] "RemoveContainer" containerID="3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695767 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df67c"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.697949 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-29gdh"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.724272 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.724327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztntl\" (UniqueName: \"kubernetes.io/projected/02592cbe-e1d4-4b62-8795-a204d5335594-kube-api-access-ztntl\") pod \"ironic-operator-controller-manager-554564d7fc-t87bb\" (UID: \"02592cbe-e1d4-4b62-8795-a204d5335594\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.739407 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zthv5\" (UniqueName: \"kubernetes.io/projected/c7e19956-a3fb-4ed2-bc2a-72084ed62ac2-kube-api-access-zthv5\") pod \"keystone-operator-controller-manager-b4d948c87-t6hpt\" (UID: \"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.783408 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9dzq\" (UniqueName: \"kubernetes.io/projected/98979ac7-9fb1-49f8-8022-562082fc76f7-kube-api-access-s9dzq\") pod \"swift-operator-controller-manager-68f46476f-dqpjx\" (UID: \"98979ac7-9fb1-49f8-8022-562082fc76f7\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.783615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f88l7\" (UniqueName: \"kubernetes.io/projected/26db9cb2-1ed4-44e4-afac-404ce0f7d445-kube-api-access-f88l7\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.783700 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls2qm\" (UniqueName: \"kubernetes.io/projected/b22b5096-41cf-40c9-94f6-8e546ca96a96-kube-api-access-ls2qm\") pod \"octavia-operator-controller-manager-69f8888797-ckxlw\" (UID: \"b22b5096-41cf-40c9-94f6-8e546ca96a96\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2xr\" (UniqueName: \"kubernetes.io/projected/5a6d3cc3-7e00-4013-b568-c2b835d8e2b9-kube-api-access-kh2xr\") pod \"nova-operator-controller-manager-567668f5cf-5b89b\" (UID: \"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-727pb\" (UniqueName: \"kubernetes.io/projected/0bdb1789-27ad-4535-86d3-fd2fb7cebba2-kube-api-access-727pb\") pod \"mariadb-operator-controller-manager-6994f66f48-mr8mh\" (UID: \"0bdb1789-27ad-4535-86d3-fd2fb7cebba2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mzr5\" (UniqueName: \"kubernetes.io/projected/7b637620-f307-4e2b-b92d-f1e0d50b0071-kube-api-access-2mzr5\") pod \"placement-operator-controller-manager-8497b45c89-bwsj2\" (UID: \"7b637620-f307-4e2b-b92d-f1e0d50b0071\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwm2j\" (UniqueName: \"kubernetes.io/projected/e0cad59b-249e-446f-b3fa-6be8aac2a858-kube-api-access-jwm2j\") pod \"ovn-operator-controller-manager-d44cf6b75-slqxz\" (UID: \"e0cad59b-249e-446f-b3fa-6be8aac2a858\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.790586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gh7\" (UniqueName: \"kubernetes.io/projected/c2c4435e-a135-4c1f-bad4-121458c09bc3-kube-api-access-b5gh7\") pod \"neutron-operator-controller-manager-64ddbf8bb-4cf2p\" (UID: \"c2c4435e-a135-4c1f-bad4-121458c09bc3\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.786897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-rcjgz"]
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.785105 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.791089 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.291066299 +0000 UTC m=+926.483584163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.791898 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-rcjgz"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.792025 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.816775 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gh7\" (UniqueName: \"kubernetes.io/projected/c2c4435e-a135-4c1f-bad4-121458c09bc3-kube-api-access-b5gh7\") pod \"neutron-operator-controller-manager-64ddbf8bb-4cf2p\" (UID: \"c2c4435e-a135-4c1f-bad4-121458c09bc3\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.817185 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls2qm\" (UniqueName: \"kubernetes.io/projected/b22b5096-41cf-40c9-94f6-8e546ca96a96-kube-api-access-ls2qm\") pod \"octavia-operator-controller-manager-69f8888797-ckxlw\" (UID: \"b22b5096-41cf-40c9-94f6-8e546ca96a96\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.817219 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f88l7\" (UniqueName: \"kubernetes.io/projected/26db9cb2-1ed4-44e4-afac-404ce0f7d445-kube-api-access-f88l7\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.817908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2xr\" (UniqueName: \"kubernetes.io/projected/5a6d3cc3-7e00-4013-b568-c2b835d8e2b9-kube-api-access-kh2xr\") pod \"nova-operator-controller-manager-567668f5cf-5b89b\" (UID: \"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.820786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-727pb\" (UniqueName: \"kubernetes.io/projected/0bdb1789-27ad-4535-86d3-fd2fb7cebba2-kube-api-access-727pb\") pod \"mariadb-operator-controller-manager-6994f66f48-mr8mh\" (UID: \"0bdb1789-27ad-4535-86d3-fd2fb7cebba2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.827185 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-pwwfk"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.841284 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.842412 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.842633 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.850255 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5mjnc"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.864770 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.867551 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.873359 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7d95\" (UniqueName: \"kubernetes.io/projected/5f4d8698-27a0-44a4-87f6-c75d4c3407bc-kube-api-access-t7d95\") pod \"telemetry-operator-controller-manager-7f45b4ff68-slj65\" (UID: \"5f4d8698-27a0-44a4-87f6-c75d4c3407bc\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddk6m\" (UniqueName: \"kubernetes.io/projected/6bdc9c62-d8c1-42d5-8696-324fdc7abc2f-kube-api-access-ddk6m\") pod \"test-operator-controller-manager-7866795846-rcjgz\" (UID: \"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f\") " pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9dzq\" (UniqueName: \"kubernetes.io/projected/98979ac7-9fb1-49f8-8022-562082fc76f7-kube-api-access-s9dzq\") pod \"swift-operator-controller-manager-68f46476f-dqpjx\" (UID: \"98979ac7-9fb1-49f8-8022-562082fc76f7\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt99q\" (UniqueName: \"kubernetes.io/projected/09ce2dcf-0fb0-4180-a019-09d1abfec00e-kube-api-access-wt99q\") pod \"watcher-operator-controller-manager-5db88f68c-bbtgm\" (UID: \"09ce2dcf-0fb0-4180-a019-09d1abfec00e\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mzr5\" (UniqueName: \"kubernetes.io/projected/7b637620-f307-4e2b-b92d-f1e0d50b0071-kube-api-access-2mzr5\") pod \"placement-operator-controller-manager-8497b45c89-bwsj2\" (UID: \"7b637620-f307-4e2b-b92d-f1e0d50b0071\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.892137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwm2j\" (UniqueName: \"kubernetes.io/projected/e0cad59b-249e-446f-b3fa-6be8aac2a858-kube-api-access-jwm2j\") pod \"ovn-operator-controller-manager-d44cf6b75-slqxz\" (UID: \"e0cad59b-249e-446f-b3fa-6be8aac2a858\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.923215 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.924208 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.928787 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df67c"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.932219 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mzr5\" (UniqueName: \"kubernetes.io/projected/7b637620-f307-4e2b-b92d-f1e0d50b0071-kube-api-access-2mzr5\") pod \"placement-operator-controller-manager-8497b45c89-bwsj2\" (UID: \"7b637620-f307-4e2b-b92d-f1e0d50b0071\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.950590 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-klpqt"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.950770 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.950863 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.951416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwm2j\" (UniqueName: \"kubernetes.io/projected/e0cad59b-249e-446f-b3fa-6be8aac2a858-kube-api-access-jwm2j\") pod \"ovn-operator-controller-manager-d44cf6b75-slqxz\" (UID: \"e0cad59b-249e-446f-b3fa-6be8aac2a858\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.960272 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-df67c"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.960546 4795 scope.go:117] "RemoveContainer" containerID="673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.976594 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"]
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.981026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9dzq\" (UniqueName: \"kubernetes.io/projected/98979ac7-9fb1-49f8-8022-562082fc76f7-kube-api-access-s9dzq\") pod \"swift-operator-controller-manager-68f46476f-dqpjx\" (UID: \"98979ac7-9fb1-49f8-8022-562082fc76f7\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.992504 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.993433 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7d95\" (UniqueName: \"kubernetes.io/projected/5f4d8698-27a0-44a4-87f6-c75d4c3407bc-kube-api-access-t7d95\") pod \"telemetry-operator-controller-manager-7f45b4ff68-slj65\" (UID: \"5f4d8698-27a0-44a4-87f6-c75d4c3407bc\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.994692 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.994783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.994868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.994944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqq8d\" (UniqueName: \"kubernetes.io/projected/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-kube-api-access-pqq8d\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.995038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddk6m\" (UniqueName: \"kubernetes.io/projected/6bdc9c62-d8c1-42d5-8696-324fdc7abc2f-kube-api-access-ddk6m\") pod \"test-operator-controller-manager-7866795846-rcjgz\" (UID: \"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f\") " pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz"
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.995136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt99q\" (UniqueName: \"kubernetes.io/projected/09ce2dcf-0fb0-4180-a019-09d1abfec00e-kube-api-access-wt99q\") pod \"watcher-operator-controller-manager-5db88f68c-bbtgm\" (UID: \"09ce2dcf-0fb0-4180-a019-09d1abfec00e\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.995616 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.996619 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.996604267 +0000 UTC m=+927.189122131 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found
Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.996891 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.034298 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.035379 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"]
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.036248 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.043036 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-phdgp"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.047964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt99q\" (UniqueName: \"kubernetes.io/projected/09ce2dcf-0fb0-4180-a019-09d1abfec00e-kube-api-access-wt99q\") pod \"watcher-operator-controller-manager-5db88f68c-bbtgm\" (UID: \"09ce2dcf-0fb0-4180-a019-09d1abfec00e\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.051807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7d95\" (UniqueName: \"kubernetes.io/projected/5f4d8698-27a0-44a4-87f6-c75d4c3407bc-kube-api-access-t7d95\") pod \"telemetry-operator-controller-manager-7f45b4ff68-slj65\" (UID: \"5f4d8698-27a0-44a4-87f6-c75d4c3407bc\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.065313 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"]
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.085804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddk6m\" (UniqueName: \"kubernetes.io/projected/6bdc9c62-d8c1-42d5-8696-324fdc7abc2f-kube-api-access-ddk6m\") pod \"test-operator-controller-manager-7866795846-rcjgz\" (UID: \"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f\") " pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.096367 4795 scope.go:117] "RemoveContainer" containerID="aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.114602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.115093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.115134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.115183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqq8d\" (UniqueName: \"kubernetes.io/projected/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-kube-api-access-pqq8d\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.115201 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.115212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4z44\" (UniqueName: \"kubernetes.io/projected/80ce3bc1-0926-47a3-acc2-6f2d8be4089c-kube-api-access-v4z44\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vv89z\" (UID: \"80ce3bc1-0926-47a3-acc2-6f2d8be4089c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"
Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.115374 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.126192 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.615227419 +0000 UTC m=+926.807745283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found
Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.126268 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.62624118 +0000 UTC m=+926.818759044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.142224 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqq8d\" (UniqueName: \"kubernetes.io/projected/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-kube-api-access-pqq8d\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.183132 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.193350 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk"]
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.199418 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.216917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4z44\" (UniqueName: \"kubernetes.io/projected/80ce3bc1-0926-47a3-acc2-6f2d8be4089c-kube-api-access-v4z44\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vv89z\" (UID: \"80ce3bc1-0926-47a3-acc2-6f2d8be4089c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.226680 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.243191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4z44\" (UniqueName: \"kubernetes.io/projected/80ce3bc1-0926-47a3-acc2-6f2d8be4089c-kube-api-access-v4z44\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vv89z\" (UID: \"80ce3bc1-0926-47a3-acc2-6f2d8be4089c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.243708 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.251357 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.310246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs"]
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.317733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.318016 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.318073 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:36.31805389 +0000 UTC m=+927.510571754 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.322530 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm"]
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.370253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.371392 4795 scope.go:117] "RemoveContainer" containerID="3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d"
Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.378945 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d\": container with ID starting with 3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d not found: ID does not exist" containerID="3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.378982 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d"} err="failed to get container status \"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d\": rpc error: code = NotFound desc = could not find container \"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d\": container with ID starting with 3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d not found: ID does not exist"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.379008 4795 scope.go:117] "RemoveContainer" containerID="673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8"
Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.379354 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8\": container with ID starting with 673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8 not found: ID does not exist" containerID="673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.379384 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8"} err="failed to get container status \"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8\": rpc error: code = NotFound desc = could not find container \"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8\": container with ID starting with 673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8 not found: ID does not exist"
Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.379405 4795 scope.go:117] "RemoveContainer" containerID="aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61"
Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.383557 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61\": container with ID starting with aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61 not found:
ID does not exist" containerID="aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.383584 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61"} err="failed to get container status \"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61\": rpc error: code = NotFound desc = could not find container \"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61\": container with ID starting with aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61 not found: ID does not exist" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.407701 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.477977 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr"] Feb 19 21:43:35 crc kubenswrapper[4795]: W0219 21:43:35.494890 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd19ed31e_e599_40ec_935d_d1d404e4c7a5.slice/crio-d2c575c923d5e6900d2cf0a9f79882dcba70089d5ac5991a65d2fca9e8238658 WatchSource:0}: Error finding container d2c575c923d5e6900d2cf0a9f79882dcba70089d5ac5991a65d2fca9e8238658: Status 404 returned error can't find the container with id d2c575c923d5e6900d2cf0a9f79882dcba70089d5ac5991a65d2fca9e8238658 Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.557899 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" path="/var/lib/kubelet/pods/26c9b930-41fe-4332-9fde-8c9d4cb304bd/volumes" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.621830 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.622485 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.622532 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:36.622517033 +0000 UTC m=+927.815034897 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.684403 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.692345 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g"] Feb 19 21:43:35 crc kubenswrapper[4795]: W0219 21:43:35.694816 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d6085d5_f9db_4129_8662_b3ae045decfc.slice/crio-bd157e371be79bf57fb80026e876d9c8dfdaefde324a2140c8aa008288f61573 WatchSource:0}: Error finding container bd157e371be79bf57fb80026e876d9c8dfdaefde324a2140c8aa008288f61573: Status 404 returned 
error can't find the container with id bd157e371be79bf57fb80026e876d9c8dfdaefde324a2140c8aa008288f61573 Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.695938 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.714692 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.720237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" event={"ID":"1d6085d5-f9db-4129-8662-b3ae045decfc","Type":"ContainerStarted","Data":"bd157e371be79bf57fb80026e876d9c8dfdaefde324a2140c8aa008288f61573"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.723596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.723801 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.723800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" event={"ID":"268c2664-09cc-4616-9280-0dd6ae4159dc","Type":"ContainerStarted","Data":"dbab9616bea5a84cb7d46cc57b72722a137e31777aad2b53dbae04e18f1628d5"} Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.723870 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:36.723851296 +0000 UTC m=+927.916369160 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.739614 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" event={"ID":"d19ed31e-e599-40ec-935d-d1d404e4c7a5","Type":"ContainerStarted","Data":"d2c575c923d5e6900d2cf0a9f79882dcba70089d5ac5991a65d2fca9e8238658"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.753360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" event={"ID":"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9","Type":"ContainerStarted","Data":"c442b135c36e227a697a05c9c223e8910e7845324206053b69553f920d6dae2f"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.757999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" event={"ID":"54a55994-69ff-48f1-8d75-24b2a828cdc9","Type":"ContainerStarted","Data":"ec41ac1e15fc06b07b215c33a62156df782759530151296fe337b5ee1b4cbc9e"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.763224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" event={"ID":"e37494c1-8780-4612-8569-fada28f0e772","Type":"ContainerStarted","Data":"bdcdbd7c44d8abf1e51f5ed3ab80789b921f7f8b2debfa911d219347f82974a8"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 
21:43:35.763956 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" event={"ID":"4cc5be3d-87d8-46a4-ba7d-d95143c11857","Type":"ContainerStarted","Data":"4e05aebfdda0466144dc837fafb427636ab38e5d9694f67c8f037795a95cd510"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.765357 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" event={"ID":"5c867f91-2ab2-43ce-8291-6d01825610d1","Type":"ContainerStarted","Data":"61d46d7d487efb406fc03aa243b02a9fbd8b5ab16421bcd7249fc64a21e01325"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.915087 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.920546 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.028922 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.029091 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.029146 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. 
No retries permitted until 2026-02-19 21:43:38.029125802 +0000 UTC m=+929.221643666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.032797 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.040611 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt"] Feb 19 21:43:36 crc kubenswrapper[4795]: W0219 21:43:36.042757 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bdb1789_27ad_4535_86d3_fd2fb7cebba2.slice/crio-fb14ff320b3a3aaf3a51fb49d16ce7882de2fab247ed18a55df683e0eaa7c00c WatchSource:0}: Error finding container fb14ff320b3a3aaf3a51fb49d16ce7882de2fab247ed18a55df683e0eaa7c00c: Status 404 returned error can't find the container with id fb14ff320b3a3aaf3a51fb49d16ce7882de2fab247ed18a55df683e0eaa7c00c Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.063157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.083882 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"] Feb 19 21:43:36 crc kubenswrapper[4795]: W0219 21:43:36.102207 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f4d8698_27a0_44a4_87f6_c75d4c3407bc.slice/crio-d3ae5512b6d771f06c9ea5013c41ee2499e5ba701bf11db4e1010a2c1069e58c WatchSource:0}: Error finding container d3ae5512b6d771f06c9ea5013c41ee2499e5ba701bf11db4e1010a2c1069e58c: Status 404 returned error can't find the container with id d3ae5512b6d771f06c9ea5013c41ee2499e5ba701bf11db4e1010a2c1069e58c Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.108912 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t7d95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-slj65_openstack-operators(5f4d8698-27a0-44a4-87f6-c75d4c3407bc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.110112 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" podUID="5f4d8698-27a0-44a4-87f6-c75d4c3407bc" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.157266 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.164271 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.173443 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.184246 4795 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"] Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.190378 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s9dzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-dqpjx_openstack-operators(98979ac7-9fb1-49f8-8022-562082fc76f7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.191548 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" podUID="98979ac7-9fb1-49f8-8022-562082fc76f7" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.194138 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-rcjgz"] Feb 19 21:43:36 crc kubenswrapper[4795]: W0219 21:43:36.204064 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09ce2dcf_0fb0_4180_a019_09d1abfec00e.slice/crio-5824c77eb3b45d38fec95d4bcff01b43ae75b9cc50186a78ae84a019839beb21 WatchSource:0}: Error finding container 5824c77eb3b45d38fec95d4bcff01b43ae75b9cc50186a78ae84a019839beb21: Status 404 returned error can't find the container with id 
5824c77eb3b45d38fec95d4bcff01b43ae75b9cc50186a78ae84a019839beb21 Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.204093 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwm2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-slqxz_openstack-operators(e0cad59b-249e-446f-b3fa-6be8aac2a858): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.205337 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" podUID="e0cad59b-249e-446f-b3fa-6be8aac2a858" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.208109 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ddk6m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-rcjgz_openstack-operators(6bdc9c62-d8c1-42d5-8696-324fdc7abc2f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.209521 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" podUID="6bdc9c62-d8c1-42d5-8696-324fdc7abc2f" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.209980 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wt99q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-bbtgm_openstack-operators(09ce2dcf-0fb0-4180-a019-09d1abfec00e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.211481 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" podUID="09ce2dcf-0fb0-4180-a019-09d1abfec00e" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.216131 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v4z44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-vv89z_openstack-operators(80ce3bc1-0926-47a3-acc2-6f2d8be4089c): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.217404 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" podUID="80ce3bc1-0926-47a3-acc2-6f2d8be4089c" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.333981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.334213 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.334263 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:38.334249264 +0000 UTC m=+929.526767128 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.639901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.640554 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.640651 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:38.640627921 +0000 UTC m=+929.833145865 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.746644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.746850 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.746894 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:38.746881373 +0000 UTC m=+929.939399227 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.773736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" event={"ID":"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2","Type":"ContainerStarted","Data":"5d353edfcb8c604eb1cfa8409cd495d1d491bf5065c9abafd939379719b91d6a"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.802527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" event={"ID":"98979ac7-9fb1-49f8-8022-562082fc76f7","Type":"ContainerStarted","Data":"02754dbae3ac80efd2c379255bf0881f0a30ed60550f13423a78abe03bad1451"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.805354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" event={"ID":"80ce3bc1-0926-47a3-acc2-6f2d8be4089c","Type":"ContainerStarted","Data":"e55431213b3390e8940335b813260dfdcb7498e87332c33f7dfb63eed87e2ca8"} Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.806179 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" podUID="98979ac7-9fb1-49f8-8022-562082fc76f7" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.807623 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" podUID="80ce3bc1-0926-47a3-acc2-6f2d8be4089c" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.807754 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" event={"ID":"02592cbe-e1d4-4b62-8795-a204d5335594","Type":"ContainerStarted","Data":"f57671b80c6123511d81b1228f9e92b126eb6c03e07be07f6e9ba376214d0eec"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.810132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" event={"ID":"b22b5096-41cf-40c9-94f6-8e546ca96a96","Type":"ContainerStarted","Data":"1988afd4158cccf6c904b3abc3ba0130736308a4a12616377d663f3c7f44174b"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.814279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" event={"ID":"5f4d8698-27a0-44a4-87f6-c75d4c3407bc","Type":"ContainerStarted","Data":"d3ae5512b6d771f06c9ea5013c41ee2499e5ba701bf11db4e1010a2c1069e58c"} Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.817297 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" podUID="5f4d8698-27a0-44a4-87f6-c75d4c3407bc" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.817408 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" event={"ID":"e0cad59b-249e-446f-b3fa-6be8aac2a858","Type":"ContainerStarted","Data":"b42734073de670a5b09af026b396e64c976c50a1d7d4cfd4cbd1ee0433a4c647"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.829237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" event={"ID":"09ce2dcf-0fb0-4180-a019-09d1abfec00e","Type":"ContainerStarted","Data":"5824c77eb3b45d38fec95d4bcff01b43ae75b9cc50186a78ae84a019839beb21"} Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.829265 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" podUID="e0cad59b-249e-446f-b3fa-6be8aac2a858" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.832499 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" podUID="09ce2dcf-0fb0-4180-a019-09d1abfec00e" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.835315 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" event={"ID":"7b637620-f307-4e2b-b92d-f1e0d50b0071","Type":"ContainerStarted","Data":"5215322d452d12698484c8773ec7ee6c3b7d76bf9e77dde4163888da83ce5cbb"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.838540 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" event={"ID":"c2c4435e-a135-4c1f-bad4-121458c09bc3","Type":"ContainerStarted","Data":"a1db2a335a9378e0436bd591daea8d8a60409947f80d08a27f62a766ff23baf4"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.841905 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" event={"ID":"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f","Type":"ContainerStarted","Data":"f219ba887bb2a5471b369e9ac7aab55f74556020daa5688afc8d7591e3f3d55a"} Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.842961 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" podUID="6bdc9c62-d8c1-42d5-8696-324fdc7abc2f" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.844736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" event={"ID":"0bdb1789-27ad-4535-86d3-fd2fb7cebba2","Type":"ContainerStarted","Data":"fb14ff320b3a3aaf3a51fb49d16ce7882de2fab247ed18a55df683e0eaa7c00c"} Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.488401 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"] Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.492330 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.509611 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"] Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.560879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.560987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.561073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.662782 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.662900 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.662939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.663376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.663612 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.684182 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.832582 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.852739 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" podUID="98979ac7-9fb1-49f8-8022-562082fc76f7" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.853622 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" podUID="6bdc9c62-d8c1-42d5-8696-324fdc7abc2f" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.853960 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" podUID="e0cad59b-249e-446f-b3fa-6be8aac2a858" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.854044 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" podUID="09ce2dcf-0fb0-4180-a019-09d1abfec00e" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 
21:43:37.854127 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" podUID="5f4d8698-27a0-44a4-87f6-c75d4c3407bc" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.861058 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" podUID="80ce3bc1-0926-47a3-acc2-6f2d8be4089c" Feb 19 21:43:38 crc kubenswrapper[4795]: I0219 21:43:38.068922 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.069405 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.069454 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. No retries permitted until 2026-02-19 21:43:42.069437833 +0000 UTC m=+933.261955697 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: I0219 21:43:38.373414 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.373589 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.373661 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:42.373637138 +0000 UTC m=+933.566155002 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: I0219 21:43:38.678322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.678542 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.678590 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:42.678576575 +0000 UTC m=+933.871094439 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: I0219 21:43:38.779493 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.779691 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.779743 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:42.779729263 +0000 UTC m=+933.972247117 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: I0219 21:43:42.142516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.142771 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.143505 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. No retries permitted until 2026-02-19 21:43:50.143437088 +0000 UTC m=+941.335954992 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: I0219 21:43:42.450613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.450830 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.450883 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:50.450867495 +0000 UTC m=+941.643385369 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: I0219 21:43:42.755019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.755256 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.755335 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:50.755314647 +0000 UTC m=+941.947832511 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: I0219 21:43:42.856540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.856692 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.856744 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:50.856730233 +0000 UTC m=+942.049248097 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.316637 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"] Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.933145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" event={"ID":"1d6085d5-f9db-4129-8662-b3ae045decfc","Type":"ContainerStarted","Data":"e07793e8fd9993a5fe24a5198c679587d953637162c90794139ae2c623166c0e"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.933619 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.934924 4795 generic.go:334] "Generic (PLEG): container finished" podID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerID="1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d" exitCode=0 Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.935291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerDied","Data":"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.935407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerStarted","Data":"bd29711f0be87b24f29b013ce08c362b1111de6a1cccb18d373555addaf3b5a6"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.939543 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" event={"ID":"7b637620-f307-4e2b-b92d-f1e0d50b0071","Type":"ContainerStarted","Data":"f83a4cba0834bf1e9732d279c5146b84203dda4a49f7281d60b184f37eb9c589"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.939869 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.941102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" event={"ID":"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9","Type":"ContainerStarted","Data":"0e508f6ef4573dd484b05d855a79b9423abb7f3ebeeaff3ae0bf8590d64b1fd0"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.941229 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.943937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" event={"ID":"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2","Type":"ContainerStarted","Data":"8e5498bd191e5e827ce0256522cb08fecd273043a3be7d1ec3efef0a0e5657cc"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.945635 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.947790 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" event={"ID":"b22b5096-41cf-40c9-94f6-8e546ca96a96","Type":"ContainerStarted","Data":"eab31bd5755bccd99d04d6cb6514e2450dcf22deb0d927a56a33ef797b39f7e1"} Feb 19 21:43:48 crc kubenswrapper[4795]: 
I0219 21:43:48.948090 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.949287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" event={"ID":"4cc5be3d-87d8-46a4-ba7d-d95143c11857","Type":"ContainerStarted","Data":"8281c9187b09a72d046cf8480d4a52b47744eaf9e077470d7bc2223b97c45f12"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.950023 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.952721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" event={"ID":"d19ed31e-e599-40ec-935d-d1d404e4c7a5","Type":"ContainerStarted","Data":"1a9e5d6728b789d5336943f354758c7acc3fb68672cc966429421a5137abaefb"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.952946 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.954376 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" event={"ID":"0bdb1789-27ad-4535-86d3-fd2fb7cebba2","Type":"ContainerStarted","Data":"69b78e9b774021411809dd96659898820a7a2f10da65bbbd0b7757dcbed0404d"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.954642 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.955414 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" podStartSLOduration=2.684259462 podStartE2EDuration="14.955400336s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.697408809 +0000 UTC m=+926.889926673" lastFinishedPulling="2026-02-19 21:43:47.968549643 +0000 UTC m=+939.161067547" observedRunningTime="2026-02-19 21:43:48.953699998 +0000 UTC m=+940.146217882" watchObservedRunningTime="2026-02-19 21:43:48.955400336 +0000 UTC m=+940.147918200" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.956329 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" event={"ID":"02592cbe-e1d4-4b62-8795-a204d5335594","Type":"ContainerStarted","Data":"e395a5835e9712e6734ed3c2badde6994d4618b6d22c115cf5f06d12296d9660"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.957056 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.962343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" event={"ID":"54a55994-69ff-48f1-8d75-24b2a828cdc9","Type":"ContainerStarted","Data":"2fece62e050d1ad6233bd56c613c1aba27b5a19766058e19acd0b5031dd3f1e0"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.963051 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.964425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" event={"ID":"e37494c1-8780-4612-8569-fada28f0e772","Type":"ContainerStarted","Data":"625a35d65b18d26d297f052bef896978805ae6b3620dc92bc4437700ff7578b5"} Feb 19 
21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.964763 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.965783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" event={"ID":"268c2664-09cc-4616-9280-0dd6ae4159dc","Type":"ContainerStarted","Data":"c66691bffe40d94b042e37112c9f486a4c6411996b806e5be0ef638c78a6f404"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.966098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.967240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" event={"ID":"c2c4435e-a135-4c1f-bad4-121458c09bc3","Type":"ContainerStarted","Data":"5956cb22ea93c0bc00613a41d173d7120d95d8d7521d324d485b64492bbe63a5"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.967581 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.971867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" event={"ID":"5c867f91-2ab2-43ce-8291-6d01825610d1","Type":"ContainerStarted","Data":"5daeceb4704e6cb53982dcd2636fc941537acca135b4dcb774dbaeb82f6debb9"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.972061 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.001244 4795 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" podStartSLOduration=2.40987124 podStartE2EDuration="15.001221701s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.498987683 +0000 UTC m=+926.691505547" lastFinishedPulling="2026-02-19 21:43:48.090338134 +0000 UTC m=+939.282856008" observedRunningTime="2026-02-19 21:43:49.000631494 +0000 UTC m=+940.193149358" watchObservedRunningTime="2026-02-19 21:43:49.001221701 +0000 UTC m=+940.193739575" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.019765 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" podStartSLOduration=2.622914529 podStartE2EDuration="15.019749124s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.694917829 +0000 UTC m=+926.887435693" lastFinishedPulling="2026-02-19 21:43:48.091752414 +0000 UTC m=+939.284270288" observedRunningTime="2026-02-19 21:43:49.018782497 +0000 UTC m=+940.211300361" watchObservedRunningTime="2026-02-19 21:43:49.019749124 +0000 UTC m=+940.212266988" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.039635 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" podStartSLOduration=2.9260576560000002 podStartE2EDuration="15.039617146s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.932621976 +0000 UTC m=+927.125139840" lastFinishedPulling="2026-02-19 21:43:48.046181446 +0000 UTC m=+939.238699330" observedRunningTime="2026-02-19 21:43:49.036800026 +0000 UTC m=+940.229317890" watchObservedRunningTime="2026-02-19 21:43:49.039617146 +0000 UTC m=+940.232135010" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.085748 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" podStartSLOduration=3.123042732 podStartE2EDuration="15.085730989s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.103218136 +0000 UTC m=+927.295736000" lastFinishedPulling="2026-02-19 21:43:48.065906383 +0000 UTC m=+939.258424257" observedRunningTime="2026-02-19 21:43:49.083927468 +0000 UTC m=+940.276445332" watchObservedRunningTime="2026-02-19 21:43:49.085730989 +0000 UTC m=+940.278248843" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.086465 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" podStartSLOduration=2.391954403 podStartE2EDuration="15.086459649s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.273965644 +0000 UTC m=+926.466483508" lastFinishedPulling="2026-02-19 21:43:47.96847088 +0000 UTC m=+939.160988754" observedRunningTime="2026-02-19 21:43:49.06630081 +0000 UTC m=+940.258818664" watchObservedRunningTime="2026-02-19 21:43:49.086459649 +0000 UTC m=+940.278977513" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.133639 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" podStartSLOduration=3.117958178 podStartE2EDuration="15.133619272s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.077205921 +0000 UTC m=+927.269723785" lastFinishedPulling="2026-02-19 21:43:48.092866995 +0000 UTC m=+939.285384879" observedRunningTime="2026-02-19 21:43:49.129947608 +0000 UTC m=+940.322465472" watchObservedRunningTime="2026-02-19 21:43:49.133619272 +0000 UTC m=+940.326137136" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.159803 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" podStartSLOduration=3.158585045 podStartE2EDuration="15.159786091s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.045205267 +0000 UTC m=+927.237723131" lastFinishedPulling="2026-02-19 21:43:48.046406303 +0000 UTC m=+939.238924177" observedRunningTime="2026-02-19 21:43:49.155778238 +0000 UTC m=+940.348296102" watchObservedRunningTime="2026-02-19 21:43:49.159786091 +0000 UTC m=+940.352303955" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.206140 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" podStartSLOduration=3.068283165 podStartE2EDuration="15.206124181s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.926508673 +0000 UTC m=+927.119026537" lastFinishedPulling="2026-02-19 21:43:48.064349669 +0000 UTC m=+939.256867553" observedRunningTime="2026-02-19 21:43:49.200472361 +0000 UTC m=+940.392990225" watchObservedRunningTime="2026-02-19 21:43:49.206124181 +0000 UTC m=+940.398642045" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.216877 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" podStartSLOduration=2.696880649 podStartE2EDuration="15.216864714s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.448603239 +0000 UTC m=+926.641121103" lastFinishedPulling="2026-02-19 21:43:47.968587294 +0000 UTC m=+939.161105168" observedRunningTime="2026-02-19 21:43:49.216046181 +0000 UTC m=+940.408564045" watchObservedRunningTime="2026-02-19 21:43:49.216864714 +0000 UTC m=+940.409382578" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.238806 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" podStartSLOduration=3.016600454 podStartE2EDuration="15.238790354s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.746377803 +0000 UTC m=+926.938895667" lastFinishedPulling="2026-02-19 21:43:47.968567683 +0000 UTC m=+939.161085567" observedRunningTime="2026-02-19 21:43:49.235717437 +0000 UTC m=+940.428235301" watchObservedRunningTime="2026-02-19 21:43:49.238790354 +0000 UTC m=+940.431308228" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.281400 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" podStartSLOduration=2.683868941 podStartE2EDuration="15.281387307s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.448431114 +0000 UTC m=+926.640948978" lastFinishedPulling="2026-02-19 21:43:48.04594948 +0000 UTC m=+939.238467344" observedRunningTime="2026-02-19 21:43:49.277800326 +0000 UTC m=+940.470318190" watchObservedRunningTime="2026-02-19 21:43:49.281387307 +0000 UTC m=+940.473905171" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.297829 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" podStartSLOduration=2.718454608 podStartE2EDuration="15.297812801s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.484968076 +0000 UTC m=+926.677485940" lastFinishedPulling="2026-02-19 21:43:48.064326259 +0000 UTC m=+939.256844133" observedRunningTime="2026-02-19 21:43:49.296141544 +0000 UTC m=+940.488659408" watchObservedRunningTime="2026-02-19 21:43:49.297812801 +0000 UTC m=+940.490330665" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.319577 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" podStartSLOduration=2.970778369 podStartE2EDuration="15.319562846s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.697404979 +0000 UTC m=+926.889922843" lastFinishedPulling="2026-02-19 21:43:48.046189416 +0000 UTC m=+939.238707320" observedRunningTime="2026-02-19 21:43:49.316629253 +0000 UTC m=+940.509147117" watchObservedRunningTime="2026-02-19 21:43:49.319562846 +0000 UTC m=+940.512080710" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.979345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerStarted","Data":"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb"} Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.178278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.183982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.270479 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.484422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.484606 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.484800 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:44:06.484784641 +0000 UTC m=+957.677302505 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.616462 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw"] Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.787845 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.788005 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.788053 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:44:06.78803922 +0000 UTC m=+957.980557084 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.889069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.889305 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.889692 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:44:06.889596819 +0000 UTC m=+958.082114693 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.988218 4795 generic.go:334] "Generic (PLEG): container finished" podID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerID="1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb" exitCode=0 Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.988335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerDied","Data":"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb"} Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.990159 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" event={"ID":"2e80963b-888b-4bb9-9259-864e38dd10ed","Type":"ContainerStarted","Data":"a1e7328f20cbd1507b935f73a351364a0e5bb3a019e4d75306f8575fdfea39ab"} Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.454698 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.461530 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.472329 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.486595 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.518109 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.630938 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.684862 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.845865 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.870702 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.882584 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.994705 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.999725 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:55 crc kubenswrapper[4795]: I0219 21:43:55.038122 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:55 crc kubenswrapper[4795]: I0219 21:43:55.117715 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.427403 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.427815 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.427869 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.428571 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.428630 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
containerID="cri-o://baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f" gracePeriod=600 Feb 19 21:43:59 crc kubenswrapper[4795]: I0219 21:43:59.051968 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f" exitCode=0 Feb 19 21:43:59 crc kubenswrapper[4795]: I0219 21:43:59.052011 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f"} Feb 19 21:43:59 crc kubenswrapper[4795]: I0219 21:43:59.052078 4795 scope.go:117] "RemoveContainer" containerID="01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.073543 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.074764 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" event={"ID":"98979ac7-9fb1-49f8-8022-562082fc76f7","Type":"ContainerStarted","Data":"e4bca9f921c13624f7c2aba14785389b4a14b501a78bfad5b5f4bb7fecf2b466"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.074983 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.077682 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" 
event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerStarted","Data":"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.078819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" event={"ID":"e0cad59b-249e-446f-b3fa-6be8aac2a858","Type":"ContainerStarted","Data":"3b4157cb2d08c22628714a0483afb6095bb656e52ad9b2d2eb7a3cbb90e1c43b"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.079002 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.080113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" event={"ID":"09ce2dcf-0fb0-4180-a019-09d1abfec00e","Type":"ContainerStarted","Data":"6cf8278b4ab796089e65ffdca64dd3ecc12d07ec0237c81f61fe05b3ac9f0c5b"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.080281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.081513 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" event={"ID":"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f","Type":"ContainerStarted","Data":"d6f9cbbb1dc0897152ab85873bc4df76698a9562a8824a4b9423d119871fc1d4"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.081680 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.082995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" event={"ID":"80ce3bc1-0926-47a3-acc2-6f2d8be4089c","Type":"ContainerStarted","Data":"baf284142362a8de2fdf82336a2b8a11e5b36283b841faa03f93e1ece324cda5"}
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.087776 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" event={"ID":"5f4d8698-27a0-44a4-87f6-c75d4c3407bc","Type":"ContainerStarted","Data":"b7d243a92321064168826ed57751ca34bd39704c96d5da3b04fc409d20ff5abf"}
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.088020 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.090206 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" event={"ID":"2e80963b-888b-4bb9-9259-864e38dd10ed","Type":"ContainerStarted","Data":"006c821325cef15c64e5965179b893fd15a5c4dfc495c8c3b5b6d2c356c673bc"}
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.090364 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw"
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.109978 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" podStartSLOduration=3.988492526 podStartE2EDuration="28.109960892s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.20388255 +0000 UTC m=+927.396400424" lastFinishedPulling="2026-02-19 21:44:00.325350926 +0000 UTC m=+951.517868790" observedRunningTime="2026-02-19 21:44:02.104411045 +0000 UTC m=+953.296928909" watchObservedRunningTime="2026-02-19 21:44:02.109960892 +0000 UTC m=+953.302478766"
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.120098 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" podStartSLOduration=1.941594075 podStartE2EDuration="27.120081468s" podCreationTimestamp="2026-02-19 21:43:35 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.216018663 +0000 UTC m=+927.408536527" lastFinishedPulling="2026-02-19 21:44:01.394506016 +0000 UTC m=+952.587023920" observedRunningTime="2026-02-19 21:44:02.11697688 +0000 UTC m=+953.309494744" watchObservedRunningTime="2026-02-19 21:44:02.120081468 +0000 UTC m=+953.312599332"
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.147272 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" podStartSLOduration=2.783707705 podStartE2EDuration="28.147254806s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.207857743 +0000 UTC m=+927.400375607" lastFinishedPulling="2026-02-19 21:44:01.571404844 +0000 UTC m=+952.763922708" observedRunningTime="2026-02-19 21:44:02.140492775 +0000 UTC m=+953.333010649" watchObservedRunningTime="2026-02-19 21:44:02.147254806 +0000 UTC m=+953.339772670"
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.163261 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k4g6z" podStartSLOduration=12.530440925 podStartE2EDuration="25.163247658s" podCreationTimestamp="2026-02-19 21:43:37 +0000 UTC" firstStartedPulling="2026-02-19 21:43:48.936874643 +0000 UTC m=+940.129392507" lastFinishedPulling="2026-02-19 21:44:01.569681376 +0000 UTC m=+952.762199240" observedRunningTime="2026-02-19 21:44:02.159344307 +0000 UTC m=+953.351862171" watchObservedRunningTime="2026-02-19 21:44:02.163247658 +0000 UTC m=+953.355765522"
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.190784 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" podStartSLOduration=3.006072707 podStartE2EDuration="28.190767795s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.20987874 +0000 UTC m=+927.402396604" lastFinishedPulling="2026-02-19 21:44:01.394573838 +0000 UTC m=+952.587091692" observedRunningTime="2026-02-19 21:44:02.188558543 +0000 UTC m=+953.381076407" watchObservedRunningTime="2026-02-19 21:44:02.190767795 +0000 UTC m=+953.383285659"
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.238272 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" podStartSLOduration=2.907060849 podStartE2EDuration="28.238258017s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.190246195 +0000 UTC m=+927.382764059" lastFinishedPulling="2026-02-19 21:44:01.521443363 +0000 UTC m=+952.713961227" observedRunningTime="2026-02-19 21:44:02.218701514 +0000 UTC m=+953.411219378" watchObservedRunningTime="2026-02-19 21:44:02.238258017 +0000 UTC m=+953.430775871"
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.241029 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" podStartSLOduration=17.408283966 podStartE2EDuration="28.241019925s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:50.630761286 +0000 UTC m=+941.823279150" lastFinishedPulling="2026-02-19 21:44:01.463497245 +0000 UTC m=+952.656015109" observedRunningTime="2026-02-19 21:44:02.235283483 +0000 UTC m=+953.427801347" watchObservedRunningTime="2026-02-19 21:44:02.241019925 +0000 UTC m=+953.433537789"
Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.553131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.564870 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.693394 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.858156 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.864369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.963716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.969335 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.032400 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" podStartSLOduration=7.652555337 podStartE2EDuration="33.032382009s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.108745572 +0000 UTC m=+927.301263436" lastFinishedPulling="2026-02-19 21:44:01.488572244 +0000 UTC m=+952.681090108" observedRunningTime="2026-02-19 21:44:02.252036226 +0000 UTC m=+953.444554090" watchObservedRunningTime="2026-02-19 21:44:07.032382009 +0000 UTC m=+958.224899873"
Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.037393 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"]
Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.076396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.132221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" event={"ID":"26db9cb2-1ed4-44e4-afac-404ce0f7d445","Type":"ContainerStarted","Data":"cc7f0da59fb39ccd0431184ae2e66e0a24fcec4aec550587684ec43bebe26d89"}
Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.273696 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"]
Feb 19 21:44:07 crc kubenswrapper[4795]: W0219 21:44:07.281870 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f7c4d4b_cf83_47e2_a75f_e3a2c9658bb4.slice/crio-50a7097d3930b132c8d79e0393eacea33918b0628bd251c142fb763dcc3fb49e WatchSource:0}: Error finding container 50a7097d3930b132c8d79e0393eacea33918b0628bd251c142fb763dcc3fb49e: Status 404 returned error can't find the container with id 50a7097d3930b132c8d79e0393eacea33918b0628bd251c142fb763dcc3fb49e
Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.833238 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k4g6z"
Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.833619 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k4g6z"
Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.876241 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k4g6z"
Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.144748 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" event={"ID":"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4","Type":"ContainerStarted","Data":"9d22271ac238e5923b60f1ca27d59155cb263d3bc580bed48248b3df363381b5"}
Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.144798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" event={"ID":"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4","Type":"ContainerStarted","Data":"50a7097d3930b132c8d79e0393eacea33918b0628bd251c142fb763dcc3fb49e"}
Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.176799 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" podStartSLOduration=34.176776806 podStartE2EDuration="34.176776806s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:44:08.168277515 +0000 UTC m=+959.360795389" watchObservedRunningTime="2026-02-19 21:44:08.176776806 +0000 UTC m=+959.369294670"
Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.188518 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k4g6z"
Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.240848 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"]
Feb 19 21:44:09 crc kubenswrapper[4795]: I0219 21:44:09.152965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" event={"ID":"26db9cb2-1ed4-44e4-afac-404ce0f7d445","Type":"ContainerStarted","Data":"ad9d2127a6df34b718fac8fe15e8d0f55a9788d56a57d4eaa932cc52e7b52096"}
Feb 19 21:44:09 crc kubenswrapper[4795]: I0219 21:44:09.153053 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:44:09 crc kubenswrapper[4795]: I0219 21:44:09.154799 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:44:09 crc kubenswrapper[4795]: I0219 21:44:09.188059 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" podStartSLOduration=33.524170855 podStartE2EDuration="35.18803245s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:44:07.044132991 +0000 UTC m=+958.236650855" lastFinishedPulling="2026-02-19 21:44:08.707994546 +0000 UTC m=+959.900512450" observedRunningTime="2026-02-19 21:44:09.180048514 +0000 UTC m=+960.372566408" watchObservedRunningTime="2026-02-19 21:44:09.18803245 +0000 UTC m=+960.380550314"
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.159258 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k4g6z" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="registry-server" containerID="cri-o://68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab" gracePeriod=2
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.279958 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw"
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.570449 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4g6z"
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.624052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") pod \"487e5483-759b-49c8-a347-f9a3ecd255ff\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") "
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.624777 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") pod \"487e5483-759b-49c8-a347-f9a3ecd255ff\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") "
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.624813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") pod \"487e5483-759b-49c8-a347-f9a3ecd255ff\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") "
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.625016 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities" (OuterVolumeSpecName: "utilities") pod "487e5483-759b-49c8-a347-f9a3ecd255ff" (UID: "487e5483-759b-49c8-a347-f9a3ecd255ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.625371 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.638467 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz" (OuterVolumeSpecName: "kube-api-access-vm6kz") pod "487e5483-759b-49c8-a347-f9a3ecd255ff" (UID: "487e5483-759b-49c8-a347-f9a3ecd255ff"). InnerVolumeSpecName "kube-api-access-vm6kz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.654101 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "487e5483-759b-49c8-a347-f9a3ecd255ff" (UID: "487e5483-759b-49c8-a347-f9a3ecd255ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.726655 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") on node \"crc\" DevicePath \"\""
Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.726696 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166314 4795 generic.go:334] "Generic (PLEG): container finished" podID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerID="68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab" exitCode=0
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerDied","Data":"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab"}
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166382 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerDied","Data":"bd29711f0be87b24f29b013ce08c362b1111de6a1cccb18d373555addaf3b5a6"}
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166399 4795 scope.go:117] "RemoveContainer" containerID="68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166456 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4g6z"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.189017 4795 scope.go:117] "RemoveContainer" containerID="1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.204158 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"]
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.212198 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"]
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.232480 4795 scope.go:117] "RemoveContainer" containerID="1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.252501 4795 scope.go:117] "RemoveContainer" containerID="68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab"
Feb 19 21:44:11 crc kubenswrapper[4795]: E0219 21:44:11.252996 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab\": container with ID starting with 68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab not found: ID does not exist" containerID="68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.253068 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab"} err="failed to get container status \"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab\": rpc error: code = NotFound desc = could not find container \"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab\": container with ID starting with 68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab not found: ID does not exist"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.253093 4795 scope.go:117] "RemoveContainer" containerID="1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb"
Feb 19 21:44:11 crc kubenswrapper[4795]: E0219 21:44:11.253456 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb\": container with ID starting with 1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb not found: ID does not exist" containerID="1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.253517 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb"} err="failed to get container status \"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb\": rpc error: code = NotFound desc = could not find container \"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb\": container with ID starting with 1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb not found: ID does not exist"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.253559 4795 scope.go:117] "RemoveContainer" containerID="1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d"
Feb 19 21:44:11 crc kubenswrapper[4795]: E0219 21:44:11.253973 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d\": container with ID starting with 1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d not found: ID does not exist" containerID="1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.254027 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d"} err="failed to get container status \"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d\": rpc error: code = NotFound desc = could not find container \"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d\": container with ID starting with 1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d not found: ID does not exist"
Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.519724 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" path="/var/lib/kubelet/pods/487e5483-759b-49c8-a347-f9a3ecd255ff/volumes"
Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.187320 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"
Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.203701 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"
Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.230843 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz"
Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.249292 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"
Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.261986 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"
Feb 19 21:44:16 crc kubenswrapper[4795]: I0219 21:44:16.699343 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"
Feb 19 21:44:17 crc kubenswrapper[4795]: I0219 21:44:17.085203 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.597156 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"]
Feb 19 21:44:33 crc kubenswrapper[4795]: E0219 21:44:33.598067 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="extract-utilities"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.598085 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="extract-utilities"
Feb 19 21:44:33 crc kubenswrapper[4795]: E0219 21:44:33.598101 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="registry-server"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.598111 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="registry-server"
Feb 19 21:44:33 crc kubenswrapper[4795]: E0219 21:44:33.598131 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="extract-content"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.598140 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="extract-content"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.598331 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="registry-server"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.599330 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.604142 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.604485 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8lfjr"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.604734 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.604976 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.609725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"]
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.648691 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.648735 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.694049 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"]
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.695760 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.704321 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.712493 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"]
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.749985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.750030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.750063 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.750079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.750123 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.750921 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.782944 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.851661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.851769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.851789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.852913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.853811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.870418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.916104 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9"
Feb 19 21:44:34 crc kubenswrapper[4795]: I0219 21:44:34.019597 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s"
Feb 19 21:44:34 crc kubenswrapper[4795]: I0219 21:44:34.276315 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"]
Feb 19 21:44:34 crc kubenswrapper[4795]: I0219 21:44:34.333394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" event={"ID":"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e","Type":"ContainerStarted","Data":"d707f3ffd06ea35995047e01b9f2f18e202cdb98a74e19c73116f0e7ffea06b9"}
Feb 19 21:44:34 crc kubenswrapper[4795]: I0219 21:44:34.355715 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"]
Feb 19 21:44:34 crc kubenswrapper[4795]: W0219 21:44:34.362553 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40671af3_115e_495a_bdf5_34580fffdc69.slice/crio-d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42 WatchSource:0}: Error finding container d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42: Status 404 returned error can't find the container with id d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42
Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.343730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" event={"ID":"40671af3-115e-495a-bdf5-34580fffdc69","Type":"ContainerStarted","Data":"d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42"}
Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.906070 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"]
Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.929052 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"]
Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.968511 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"]
Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.968651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.980485 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.980625 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.980676 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.081694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.081779 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.081841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.083300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.083342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.107958 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj"
Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.293044 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.455142 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"] Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.490614 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.493621 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.495153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.592979 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4wh\" (UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.593374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.593447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 
21:44:36.694390 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.694512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss4wh\" (UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.694623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.695388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.695863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.722099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss4wh\" 
(UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.808848 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.823380 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: W0219 21:44:36.853864 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod880246d9_9662_47e8_a0ff_5d2aca6de029.slice/crio-236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2 WatchSource:0}: Error finding container 236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2: Status 404 returned error can't find the container with id 236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2 Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.068607 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.071825 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.074377 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.074509 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.084851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.084893 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pkz5l" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.085080 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.086390 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.087524 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.094297 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206137 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206247 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206408 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307868 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307998 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.308280 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.308887 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.309561 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.309750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.310054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.310145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " 
pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.311734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.314347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.314593 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.320520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.325447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.332019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.342274 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:44:37 crc kubenswrapper[4795]: W0219 21:44:37.353692 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cb7920e_685c_4bb7_b276_3bf902251bd7.slice/crio-d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4 WatchSource:0}: Error finding container d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4: Status 404 returned error can't find the container with id d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4 Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.381593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" event={"ID":"880246d9-9662-47e8-a0ff-5d2aca6de029","Type":"ContainerStarted","Data":"236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2"} Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.383676 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" event={"ID":"5cb7920e-685c-4bb7-b276-3bf902251bd7","Type":"ContainerStarted","Data":"d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4"} Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.440197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.604385 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.605721 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.607267 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.610036 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.610046 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.610103 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.610984 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.611125 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.611932 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hqncs" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.630552 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713696 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713723 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713922 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713941 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.714003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.714115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.714193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pjq4\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.714238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815881 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pjq4\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816066 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816105 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816145 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816426 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.817048 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.817473 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc 
kubenswrapper[4795]: I0219 21:44:37.818010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.822802 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.822851 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.823251 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.823498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.832742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pjq4\" (UniqueName: 
\"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.840578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.933784 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.944674 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:44:38 crc kubenswrapper[4795]: I0219 21:44:38.399782 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerStarted","Data":"43ff46f740a6f7a342639c9893e1a10e76310ef799a0ad928eb028dabd7dd840"} Feb 19 21:44:38 crc kubenswrapper[4795]: I0219 21:44:38.447970 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.002246 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.003721 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.009102 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.009766 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-snswc" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.010008 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.010187 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.012299 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.032246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.135892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.135941 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.135975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gtmqc\" (UniqueName: \"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.135992 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.136016 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.136133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.136214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.136401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237810 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmqc\" (UniqueName: \"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.238070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.238152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.238725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.238935 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.239024 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.239101 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.239524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.244253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.253260 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.254560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmqc\" (UniqueName: 
\"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.287235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.329606 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.470602 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerStarted","Data":"12b91da897daae78f76b09af510ceca04ac8909ff3967813b7e6274bf414c6a5"} Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.987899 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.475646 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.477979 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.481883 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-pwwm8" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.482101 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.482366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.482669 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.483566 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.492267 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerStarted","Data":"050b9a153d584bbd1ba63be9e7a93c951075127827418493ce2ba5e1d8a7ed20"} Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557288 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557419 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557476 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") pod \"openstack-cell1-galera-0\" 
(UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557548 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659737 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659945 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.660842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 
crc kubenswrapper[4795]: I0219 21:44:40.661099 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.661475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.661631 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.661752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.669310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.678447 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.683057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.698034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.783440 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.784693 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.787353 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.787619 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hptrp" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.787897 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.790941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.844351 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867566 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " 
pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867610 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969292 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969369 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.970028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.973966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.974553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.979128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " 
pod="openstack/memcached-0" Feb 19 21:44:41 crc kubenswrapper[4795]: I0219 21:44:41.002357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:41 crc kubenswrapper[4795]: I0219 21:44:41.100145 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.863734 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.864597 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.867749 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2gbt9" Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.874759 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.896535 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") pod \"kube-state-metrics-0\" (UID: \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\") " pod="openstack/kube-state-metrics-0" Feb 19 21:44:43 crc kubenswrapper[4795]: I0219 21:44:42.998353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") pod \"kube-state-metrics-0\" (UID: \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\") " 
pod="openstack/kube-state-metrics-0" Feb 19 21:44:43 crc kubenswrapper[4795]: I0219 21:44:43.020642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") pod \"kube-state-metrics-0\" (UID: \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\") " pod="openstack/kube-state-metrics-0" Feb 19 21:44:43 crc kubenswrapper[4795]: I0219 21:44:43.202213 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.159655 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.162463 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.167082 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.167099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.167474 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.167581 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pkmhz" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.172394 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.183919 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264038 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264280 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264459 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264743 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.276886 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.281751 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.285830 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.286155 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-877p5" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.286454 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.288974 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.290857 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.303342 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.308865 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367388 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " 
pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.368376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.368685 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.368898 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369097 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369332 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") pod \"ovn-controller-ovs-tl5hf\" (UID: 
\"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369804 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369841 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " 
pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.371122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.371846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.379692 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.379692 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.387522 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.388187 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.402899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471201 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471220 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") pod \"ovn-controller-w9fbs\" (UID: 
\"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471357 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471403 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.472199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.472385 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.472502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.472776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.473331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.473495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.474709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 
21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.474913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.475942 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.478005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.478183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.487184 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.490807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.496848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.608917 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.626481 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.046883 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.048042 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.053393 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.053758 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.054766 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-p6ptc" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.055403 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.069719 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198809 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198840 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwcsv\" (UniqueName: \"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") pod \"ovsdbserver-sb-0\" (UID: 
\"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.199057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.199271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.199333 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc 
kubenswrapper[4795]: I0219 21:44:49.300481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300535 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwcsv\" (UniqueName: \"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300612 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300689 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300723 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300967 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.301145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.301776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 
crc kubenswrapper[4795]: I0219 21:44:49.301967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.310812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.310827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.310923 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.321083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.322221 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwcsv\" (UniqueName: 
\"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.374313 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:53 crc kubenswrapper[4795]: E0219 21:44:53.027691 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76" Feb 19 21:44:53 crc kubenswrapper[4795]: E0219 21:44:53.028180 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7cwvp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(7b096325-542d-4ac6-8d16-8aa0937013b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:44:53 crc 
kubenswrapper[4795]: E0219 21:44:53.029395 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" Feb 19 21:44:53 crc kubenswrapper[4795]: E0219 21:44:53.595056 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76\\\"\"" pod="openstack/rabbitmq-server-0" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.157982 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.160083 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.163241 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.163949 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.164321 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.202287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.202352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.202857 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.304858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.304954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.305015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.306281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.314385 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.322263 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.494094 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: E0219 21:45:00.616477 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed" Feb 19 21:45:00 crc kubenswrapper[4795]: E0219 21:45:00.616666 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtmqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(0bbc6c00-2fc9-42cb-9c5a-9a160903ae99): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:00 crc kubenswrapper[4795]: E0219 21:45:00.617897 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" Feb 19 21:45:00 crc kubenswrapper[4795]: E0219 21:45:00.659418 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed\\\"\"" pod="openstack/openstack-galera-0" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.510217 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.510365 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnfdk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-ppz5s_openstack(8f68f295-3b94-4e1f-8e9d-ba49ffe5198e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.511660 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" podUID="8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.515664 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.515868 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss4wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-74c7x_openstack(5cb7920e-685c-4bb7-b276-3bf902251bd7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.517031 4795 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" podUID="5cb7920e-685c-4bb7-b276-3bf902251bd7" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.547762 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.547927 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2pvqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-xj6c9_openstack(40671af3-115e-495a-bdf5-34580fffdc69): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.550340 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" podUID="40671af3-115e-495a-bdf5-34580fffdc69" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.562350 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.562529 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zdpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil
,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-f54874ffc-9rmlj_openstack(880246d9-9662-47e8-a0ff-5d2aca6de029): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.564413 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" podUID="880246d9-9662-47e8-a0ff-5d2aca6de029" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.668127 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" podUID="880246d9-9662-47e8-a0ff-5d2aca6de029" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.668606 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" podUID="5cb7920e-685c-4bb7-b276-3bf902251bd7" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.093143 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.099578 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.133872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") pod \"40671af3-115e-495a-bdf5-34580fffdc69\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.133969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") pod \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.134006 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") pod \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.134031 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") pod \"40671af3-115e-495a-bdf5-34580fffdc69\" (UID: 
\"40671af3-115e-495a-bdf5-34580fffdc69\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.134074 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") pod \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.134884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config" (OuterVolumeSpecName: "config") pod "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" (UID: "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.135017 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.135322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config" (OuterVolumeSpecName: "config") pod "40671af3-115e-495a-bdf5-34580fffdc69" (UID: "40671af3-115e-495a-bdf5-34580fffdc69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.137624 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" (UID: "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.138591 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.140625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk" (OuterVolumeSpecName: "kube-api-access-tnfdk") pod "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" (UID: "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e"). InnerVolumeSpecName "kube-api-access-tnfdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.144816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh" (OuterVolumeSpecName: "kube-api-access-2pvqh") pod "40671af3-115e-495a-bdf5-34580fffdc69" (UID: "40671af3-115e-495a-bdf5-34580fffdc69"). InnerVolumeSpecName "kube-api-access-2pvqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236293 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236322 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236332 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236341 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236352 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.285335 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 21:45:02 crc kubenswrapper[4795]: W0219 21:45:02.297360 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1627c007_5a7c_4fa5_a15f_0da43560c849.slice/crio-a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32 WatchSource:0}: Error finding container a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32: Status 404 
returned error can't find the container with id a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32 Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.322757 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:45:02 crc kubenswrapper[4795]: W0219 21:45:02.335558 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb3b374e_f01b_4997_9ecf_fbeeb384cc2c.slice/crio-1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3 WatchSource:0}: Error finding container 1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3: Status 404 returned error can't find the container with id 1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3 Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.343070 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.400833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:45:02 crc kubenswrapper[4795]: W0219 21:45:02.407899 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a19676c_9314_43a3_a2f8_bcf56d6b5ce3.slice/crio-79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3 WatchSource:0}: Error finding container 79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3: Status 404 returned error can't find the container with id 79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3 Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.489556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.609722 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:45:02 crc 
kubenswrapper[4795]: I0219 21:45:02.675510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" event={"ID":"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e","Type":"ContainerDied","Data":"d707f3ffd06ea35995047e01b9f2f18e202cdb98a74e19c73116f0e7ffea06b9"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.675541 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.676931 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerStarted","Data":"0753bcc18c087ec61d4625b239ed921fd6b476f148310ba726f57a4cfa8d345c"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.677920 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerStarted","Data":"79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.679002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" event={"ID":"40671af3-115e-495a-bdf5-34580fffdc69","Type":"ContainerDied","Data":"d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.679043 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.680425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" event={"ID":"1627c007-5a7c-4fa5-a15f-0da43560c849","Type":"ContainerStarted","Data":"6d51fcadf72cdef4da5baae1dcfd6cb93884c814c13eaf31041e3c1336f34a93"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.680454 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" event={"ID":"1627c007-5a7c-4fa5-a15f-0da43560c849","Type":"ContainerStarted","Data":"a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.681505 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerStarted","Data":"fa781215a57c5a384dc9196151cb9d88b19a59e6ec4219a4b6443b0c5d96ab8f"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.682407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3","Type":"ContainerStarted","Data":"aada954b6c8106a5c25613b1c4b96d76ce41049aa7128aa357d9511f84c5abf0"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.683583 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c","Type":"ContainerStarted","Data":"1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.685266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs" event={"ID":"c30e8522-2d7f-4f10-a0b4-a7cfc351d093","Type":"ContainerStarted","Data":"dc18420d588bd541d274269ae096f1224bb6a914c81107d9d0d3602a4e7a25d2"} Feb 19 21:45:02 
crc kubenswrapper[4795]: I0219 21:45:02.702434 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" podStartSLOduration=2.702415031 podStartE2EDuration="2.702415031s" podCreationTimestamp="2026-02-19 21:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:02.699881309 +0000 UTC m=+1013.892399173" watchObservedRunningTime="2026-02-19 21:45:02.702415031 +0000 UTC m=+1013.894932895" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.746563 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.746607 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.802680 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.810084 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"] Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.157326 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:45:03 crc kubenswrapper[4795]: W0219 21:45:03.439101 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c5a8678_8ce2_4bee_9160_37b1dea9f897.slice/crio-f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70 WatchSource:0}: Error finding container f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70: Status 404 returned error can't find the container with id f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70 Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.524380 
4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40671af3-115e-495a-bdf5-34580fffdc69" path="/var/lib/kubelet/pods/40671af3-115e-495a-bdf5-34580fffdc69/volumes" Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.524826 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" path="/var/lib/kubelet/pods/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e/volumes" Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.697299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerStarted","Data":"f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70"} Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.698978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerStarted","Data":"026030d64ab176967289d43803e975c62df885e093dcaebf727c13a2d67da5ff"} Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.702072 4795 generic.go:334] "Generic (PLEG): container finished" podID="1627c007-5a7c-4fa5-a15f-0da43560c849" containerID="6d51fcadf72cdef4da5baae1dcfd6cb93884c814c13eaf31041e3c1336f34a93" exitCode=0 Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.702247 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" event={"ID":"1627c007-5a7c-4fa5-a15f-0da43560c849","Type":"ContainerDied","Data":"6d51fcadf72cdef4da5baae1dcfd6cb93884c814c13eaf31041e3c1336f34a93"} Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.704679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerStarted","Data":"f5dc53fcd687359370a9224413921410e03027d27bb1e741143948af4422db6c"} Feb 19 21:45:06 crc kubenswrapper[4795]: 
I0219 21:45:06.594762 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.730315 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") pod \"1627c007-5a7c-4fa5-a15f-0da43560c849\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.730375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") pod \"1627c007-5a7c-4fa5-a15f-0da43560c849\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.730596 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") pod \"1627c007-5a7c-4fa5-a15f-0da43560c849\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.731708 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume" (OuterVolumeSpecName: "config-volume") pod "1627c007-5a7c-4fa5-a15f-0da43560c849" (UID: "1627c007-5a7c-4fa5-a15f-0da43560c849"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.735314 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1627c007-5a7c-4fa5-a15f-0da43560c849" (UID: "1627c007-5a7c-4fa5-a15f-0da43560c849"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.736927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" event={"ID":"1627c007-5a7c-4fa5-a15f-0da43560c849","Type":"ContainerDied","Data":"a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32"} Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.736965 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.737022 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.737315 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn" (OuterVolumeSpecName: "kube-api-access-frbhn") pod "1627c007-5a7c-4fa5-a15f-0da43560c849" (UID: "1627c007-5a7c-4fa5-a15f-0da43560c849"). InnerVolumeSpecName "kube-api-access-frbhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.739207 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerID="026030d64ab176967289d43803e975c62df885e093dcaebf727c13a2d67da5ff" exitCode=0 Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.739238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerDied","Data":"026030d64ab176967289d43803e975c62df885e093dcaebf727c13a2d67da5ff"} Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.832897 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.832935 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.832949 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.756708 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerStarted","Data":"ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196"} Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.759313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerStarted","Data":"fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b"} Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.762788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3","Type":"ContainerStarted","Data":"db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe"} Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.764016 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.766781 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.768155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerStarted","Data":"146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033"} Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.782929 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.202192063 podStartE2EDuration="27.782870794s" podCreationTimestamp="2026-02-19 21:44:40 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.502001688 +0000 UTC m=+1013.694519552" lastFinishedPulling="2026-02-19 21:45:07.082680419 +0000 UTC m=+1018.275198283" observedRunningTime="2026-02-19 21:45:07.779422016 +0000 UTC m=+1018.971939880" watchObservedRunningTime="2026-02-19 21:45:07.782870794 +0000 UTC m=+1018.975388658" Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.801986 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=20.692956864 podStartE2EDuration="25.801958023s" podCreationTimestamp="2026-02-19 21:44:42 +0000 UTC" 
firstStartedPulling="2026-02-19 21:45:02.339129786 +0000 UTC m=+1013.531647650" lastFinishedPulling="2026-02-19 21:45:07.448130955 +0000 UTC m=+1018.640648809" observedRunningTime="2026-02-19 21:45:07.794574714 +0000 UTC m=+1018.987092578" watchObservedRunningTime="2026-02-19 21:45:07.801958023 +0000 UTC m=+1018.994475887" Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.778137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs" event={"ID":"c30e8522-2d7f-4f10-a0b4-a7cfc351d093","Type":"ContainerStarted","Data":"e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27"} Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.778542 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-w9fbs" Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.781996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerStarted","Data":"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55"} Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.785971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c","Type":"ContainerStarted","Data":"0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824"} Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.788314 4795 generic.go:334] "Generic (PLEG): container finished" podID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerID="000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde" exitCode=0 Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.788404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerDied","Data":"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde"} Feb 19 21:45:08 crc 
kubenswrapper[4795]: I0219 21:45:08.795662 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-w9fbs" podStartSLOduration=16.550970748 podStartE2EDuration="21.795646581s" podCreationTimestamp="2026-02-19 21:44:47 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.138302511 +0000 UTC m=+1013.330820405" lastFinishedPulling="2026-02-19 21:45:07.382978374 +0000 UTC m=+1018.575496238" observedRunningTime="2026-02-19 21:45:08.795398864 +0000 UTC m=+1019.987916728" watchObservedRunningTime="2026-02-19 21:45:08.795646581 +0000 UTC m=+1019.988164445" Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.818835 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=29.320914657 podStartE2EDuration="29.818820815s" podCreationTimestamp="2026-02-19 21:44:39 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.340820664 +0000 UTC m=+1013.533338528" lastFinishedPulling="2026-02-19 21:45:02.838726822 +0000 UTC m=+1014.031244686" observedRunningTime="2026-02-19 21:45:08.813259618 +0000 UTC m=+1020.005777492" watchObservedRunningTime="2026-02-19 21:45:08.818820815 +0000 UTC m=+1020.011338679" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.814510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerStarted","Data":"e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373"} Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.841877 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:45:09 crc kubenswrapper[4795]: E0219 21:45:09.842322 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1627c007-5a7c-4fa5-a15f-0da43560c849" containerName="collect-profiles" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.842342 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1627c007-5a7c-4fa5-a15f-0da43560c849" containerName="collect-profiles" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.842564 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1627c007-5a7c-4fa5-a15f-0da43560c849" containerName="collect-profiles" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.843230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.845661 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.848514 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.851574 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.927591584 podStartE2EDuration="21.851556277s" podCreationTimestamp="2026-02-19 21:44:48 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.597668751 +0000 UTC m=+1013.790186615" lastFinishedPulling="2026-02-19 21:45:09.521633404 +0000 UTC m=+1020.714151308" observedRunningTime="2026-02-19 21:45:09.844795526 +0000 UTC m=+1021.037313390" watchObservedRunningTime="2026-02-19 21:45:09.851556277 +0000 UTC m=+1021.044074131" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.854127 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerStarted","Data":"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23"} Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.862351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerStarted","Data":"9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b"} Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.899736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.900498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.900547 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.900588 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.900742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.901751 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.991077 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.919675076 podStartE2EDuration="23.991057378s" podCreationTimestamp="2026-02-19 21:44:46 +0000 UTC" firstStartedPulling="2026-02-19 21:45:03.442290407 +0000 UTC m=+1014.634808271" lastFinishedPulling="2026-02-19 21:45:09.513672699 +0000 UTC m=+1020.706190573" observedRunningTime="2026-02-19 21:45:09.912302293 +0000 UTC m=+1021.104820157" watchObservedRunningTime="2026-02-19 21:45:09.991057378 +0000 UTC m=+1021.183575242" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.995707 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004445 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.005514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") pod 
\"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.005555 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.005885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.018416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.018866 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.035311 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.037303 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.039204 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.041265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.063972 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.112964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.113254 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.113364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" 
Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.113440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.143141 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.172053 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.175805 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.188917 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.190903 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.215863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.215905 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.215955 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.215990 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216018 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216076 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.220645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.221132 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.222228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.239579 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.242855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329007 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329118 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.330553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.331183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.331594 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.333741 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.348101 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.353016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.363033 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.374999 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.423482 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.430808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") pod \"880246d9-9662-47e8-a0ff-5d2aca6de029\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.430978 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") pod \"880246d9-9662-47e8-a0ff-5d2aca6de029\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.431007 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") pod \"880246d9-9662-47e8-a0ff-5d2aca6de029\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.433140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config" (OuterVolumeSpecName: "config") pod "880246d9-9662-47e8-a0ff-5d2aca6de029" (UID: "880246d9-9662-47e8-a0ff-5d2aca6de029"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.433642 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "880246d9-9662-47e8-a0ff-5d2aca6de029" (UID: "880246d9-9662-47e8-a0ff-5d2aca6de029"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.444719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq" (OuterVolumeSpecName: "kube-api-access-2zdpq") pod "880246d9-9662-47e8-a0ff-5d2aca6de029" (UID: "880246d9-9662-47e8-a0ff-5d2aca6de029"). InnerVolumeSpecName "kube-api-access-2zdpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.468821 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.512389 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.532671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") pod \"5cb7920e-685c-4bb7-b276-3bf902251bd7\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.532833 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss4wh\" (UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") pod \"5cb7920e-685c-4bb7-b276-3bf902251bd7\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.532880 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") pod \"5cb7920e-685c-4bb7-b276-3bf902251bd7\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.533231 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.533252 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.533262 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.533309 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config" (OuterVolumeSpecName: "config") pod "5cb7920e-685c-4bb7-b276-3bf902251bd7" (UID: "5cb7920e-685c-4bb7-b276-3bf902251bd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.534975 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5cb7920e-685c-4bb7-b276-3bf902251bd7" (UID: "5cb7920e-685c-4bb7-b276-3bf902251bd7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.544362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh" (OuterVolumeSpecName: "kube-api-access-ss4wh") pod "5cb7920e-685c-4bb7-b276-3bf902251bd7" (UID: "5cb7920e-685c-4bb7-b276-3bf902251bd7"). InnerVolumeSpecName "kube-api-access-ss4wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.635142 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss4wh\" (UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.635980 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.635994 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.713006 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.852305 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.852359 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.856326 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:10 crc kubenswrapper[4795]: W0219 21:45:10.867612 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod267a3a62_4f3e_43c8_a1a8_8b47e9d17e80.slice/crio-f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b WatchSource:0}: Error finding container f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b: Status 404 returned error can't find the container with id 
f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.871946 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" event={"ID":"5cb7920e-685c-4bb7-b276-3bf902251bd7","Type":"ContainerDied","Data":"d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4"} Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.872065 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.876868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerStarted","Data":"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf"} Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.876927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.876952 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.879531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" event={"ID":"880246d9-9662-47e8-a0ff-5d2aca6de029","Type":"ContainerDied","Data":"236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2"} Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.879596 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.882721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9cs4" event={"ID":"eee0ea5d-4b43-4421-b23e-555c5eac3564","Type":"ContainerStarted","Data":"eaba90113d6ff0b858d733af82b8a4a862659df0d41e63fdc645db66d9298341"} Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.883326 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.909194 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tl5hf" podStartSLOduration=18.932413798 podStartE2EDuration="23.90915543s" podCreationTimestamp="2026-02-19 21:44:47 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.411065698 +0000 UTC m=+1013.603583562" lastFinishedPulling="2026-02-19 21:45:07.38780733 +0000 UTC m=+1018.580325194" observedRunningTime="2026-02-19 21:45:10.89571431 +0000 UTC m=+1022.088232174" watchObservedRunningTime="2026-02-19 21:45:10.90915543 +0000 UTC m=+1022.101673314" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.934252 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.948260 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.961557 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"] Feb 19 21:45:10 crc kubenswrapper[4795]: W0219 21:45:10.973475 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aa18e6a_5d0f_4f6e_b36c_3a2b9e2d0d24.slice/crio-30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d WatchSource:0}: Error 
finding container 30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d: Status 404 returned error can't find the container with id 30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.977385 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.981578 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.488385 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.536043 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cb7920e-685c-4bb7-b276-3bf902251bd7" path="/var/lib/kubelet/pods/5cb7920e-685c-4bb7-b276-3bf902251bd7/volumes" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.536846 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="880246d9-9662-47e8-a0ff-5d2aca6de029" path="/var/lib/kubelet/pods/880246d9-9662-47e8-a0ff-5d2aca6de029/volumes" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.544581 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.890665 4795 generic.go:334] "Generic (PLEG): container finished" podID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerID="7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6" exitCode=0 Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.890742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerDied","Data":"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 
21:45:11.890772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerStarted","Data":"30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.892719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9cs4" event={"ID":"eee0ea5d-4b43-4421-b23e-555c5eac3564","Type":"ContainerStarted","Data":"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.898705 4795 generic.go:334] "Generic (PLEG): container finished" podID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerID="7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad" exitCode=0 Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.900918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerDied","Data":"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.901001 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerStarted","Data":"f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.902675 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.977502 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-p9cs4" podStartSLOduration=2.977486667 podStartE2EDuration="2.977486667s" podCreationTimestamp="2026-02-19 21:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:11.959859549 +0000 UTC m=+1023.152377433" watchObservedRunningTime="2026-02-19 21:45:11.977486667 +0000 UTC m=+1023.170004531" Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.524185 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.910068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerStarted","Data":"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603"} Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.910319 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.912461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerStarted","Data":"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130"} Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.936439 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" podStartSLOduration=2.498844959 podStartE2EDuration="2.936411042s" podCreationTimestamp="2026-02-19 21:45:10 +0000 UTC" firstStartedPulling="2026-02-19 21:45:10.985344013 +0000 UTC m=+1022.177861877" lastFinishedPulling="2026-02-19 21:45:11.422910096 +0000 UTC m=+1022.615427960" observedRunningTime="2026-02-19 21:45:12.931513674 +0000 UTC m=+1024.124031548" watchObservedRunningTime="2026-02-19 21:45:12.936411042 +0000 UTC m=+1024.128928946" Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.954177 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" 
podStartSLOduration=2.481070436 podStartE2EDuration="2.954144704s" podCreationTimestamp="2026-02-19 21:45:10 +0000 UTC" firstStartedPulling="2026-02-19 21:45:10.871346361 +0000 UTC m=+1022.063864225" lastFinishedPulling="2026-02-19 21:45:11.344420619 +0000 UTC m=+1022.536938493" observedRunningTime="2026-02-19 21:45:12.950723067 +0000 UTC m=+1024.143240941" watchObservedRunningTime="2026-02-19 21:45:12.954144704 +0000 UTC m=+1024.146662568" Feb 19 21:45:13 crc kubenswrapper[4795]: I0219 21:45:13.206878 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 21:45:13 crc kubenswrapper[4795]: I0219 21:45:13.378817 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 19 21:45:13 crc kubenswrapper[4795]: I0219 21:45:13.455371 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 19 21:45:13 crc kubenswrapper[4795]: I0219 21:45:13.917893 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.417204 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.607153 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.608333 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.610379 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-x48sf" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.610959 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.611068 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.611093 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.631051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697100 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697296 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " 
pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697459 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697518 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799180 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") pod \"ovn-northd-0\" (UID: 
\"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.800664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.800724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.801830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.805536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.810783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.813017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.820389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.930443 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:45:15 crc kubenswrapper[4795]: I0219 21:45:15.385021 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:45:15 crc kubenswrapper[4795]: I0219 21:45:15.935292 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerStarted","Data":"dbab531f1a8f22d58c44dcac6c6209fda329451de2d8664028adcfc876aa2507"} Feb 19 21:45:15 crc kubenswrapper[4795]: I0219 21:45:15.936686 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerStarted","Data":"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0"} Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.101775 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.943722 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerStarted","Data":"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3"} Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.944118 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerStarted","Data":"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f"} Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.945272 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.964774 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.002435226 podStartE2EDuration="2.964750197s" podCreationTimestamp="2026-02-19 21:45:14 +0000 UTC" firstStartedPulling="2026-02-19 21:45:15.400831007 +0000 UTC m=+1026.593348871" lastFinishedPulling="2026-02-19 21:45:16.363145978 +0000 UTC m=+1027.555663842" observedRunningTime="2026-02-19 21:45:16.963017718 +0000 UTC m=+1028.155535582" watchObservedRunningTime="2026-02-19 21:45:16.964750197 +0000 UTC m=+1028.157268091" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.606240 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.608849 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.611027 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.618196 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.778017 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.778104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.879351 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.879463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") pod \"root-account-create-update-jlwf9\" (UID: 
\"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.881005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.901203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.955813 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.967220 4795 generic.go:334] "Generic (PLEG): container finished" podID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerID="106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0" exitCode=0 Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.967261 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerDied","Data":"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0"} Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.365376 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.414453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:20 crc kubenswrapper[4795]: W0219 21:45:20.420674 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11cc466e_0752_46b5_9775_c29748b13724.slice/crio-979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790 WatchSource:0}: Error finding container 979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790: Status 404 returned error can't find the container with id 979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790 Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.513875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.577694 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.974649 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="11cc466e-0752-46b5-9775-c29748b13724" containerID="0c92f3d0df6fb4ffa1967f7b15d462ab0538b07666cad5a1fe2cc6d118293fe8" exitCode=0 Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.974743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jlwf9" event={"ID":"11cc466e-0752-46b5-9775-c29748b13724","Type":"ContainerDied","Data":"0c92f3d0df6fb4ffa1967f7b15d462ab0538b07666cad5a1fe2cc6d118293fe8"} Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.974772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jlwf9" event={"ID":"11cc466e-0752-46b5-9775-c29748b13724","Type":"ContainerStarted","Data":"979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790"} Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.976595 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="dnsmasq-dns" containerID="cri-o://66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130" gracePeriod=10 Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.976663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerStarted","Data":"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153"} Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.018940 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371992.835857 podStartE2EDuration="44.018918202s" podCreationTimestamp="2026-02-19 21:44:37 +0000 UTC" firstStartedPulling="2026-02-19 21:44:40.037786119 +0000 UTC m=+991.230303983" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:21.013576201 +0000 UTC m=+1032.206094085" watchObservedRunningTime="2026-02-19 21:45:21.018918202 
+0000 UTC m=+1032.211436066" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.415822 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.507030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") pod \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.507134 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") pod \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.507207 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") pod \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.507285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") pod \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.520698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v" (OuterVolumeSpecName: "kube-api-access-n2p7v") pod "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" (UID: "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80"). 
InnerVolumeSpecName "kube-api-access-n2p7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.544587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config" (OuterVolumeSpecName: "config") pod "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" (UID: "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.548321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" (UID: "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.552832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" (UID: "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.609902 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.610284 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.610350 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.610380 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984371 4795 generic.go:334] "Generic (PLEG): container finished" podID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerID="66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130" exitCode=0
Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984422 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp"
Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerDied","Data":"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130"}
Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerDied","Data":"f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b"}
Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984720 4795 scope.go:117] "RemoveContainer" containerID="66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130"
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.002606 4795 scope.go:117] "RemoveContainer" containerID="7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad"
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.018210 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"]
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.023644 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"]
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.042326 4795 scope.go:117] "RemoveContainer" containerID="66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130"
Feb 19 21:45:22 crc kubenswrapper[4795]: E0219 21:45:22.044309 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130\": container with ID starting with 66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130 not found: ID does not exist" containerID="66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130"
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.044356 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130"} err="failed to get container status \"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130\": rpc error: code = NotFound desc = could not find container \"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130\": container with ID starting with 66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130 not found: ID does not exist"
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.044389 4795 scope.go:117] "RemoveContainer" containerID="7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad"
Feb 19 21:45:22 crc kubenswrapper[4795]: E0219 21:45:22.044763 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad\": container with ID starting with 7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad not found: ID does not exist" containerID="7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad"
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.044800 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad"} err="failed to get container status \"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad\": rpc error: code = NotFound desc = could not find container \"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad\": container with ID starting with 7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad not found: ID does not exist"
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.303966 4795 util.go:48] "No ready
sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jlwf9"
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.421149 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") pod \"11cc466e-0752-46b5-9775-c29748b13724\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") "
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.421532 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") pod \"11cc466e-0752-46b5-9775-c29748b13724\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") "
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.421706 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11cc466e-0752-46b5-9775-c29748b13724" (UID: "11cc466e-0752-46b5-9775-c29748b13724"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.429563 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds" (OuterVolumeSpecName: "kube-api-access-cchds") pod "11cc466e-0752-46b5-9775-c29748b13724" (UID: "11cc466e-0752-46b5-9775-c29748b13724"). InnerVolumeSpecName "kube-api-access-cchds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.523861 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.523908 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.000287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jlwf9" event={"ID":"11cc466e-0752-46b5-9775-c29748b13724","Type":"ContainerDied","Data":"979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790"}
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.000356 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jlwf9"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.000365 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.152756 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"]
Feb 19 21:45:23 crc kubenswrapper[4795]: E0219 21:45:23.153476 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="dnsmasq-dns"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153501 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="dnsmasq-dns"
Feb 19 21:45:23 crc kubenswrapper[4795]: E0219 21:45:23.153528 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="init"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153536 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="init"
Feb 19 21:45:23 crc kubenswrapper[4795]: E0219 21:45:23.153558 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc466e-0752-46b5-9775-c29748b13724" containerName="mariadb-account-create-update"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153567 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc466e-0752-46b5-9775-c29748b13724" containerName="mariadb-account-create-update"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153786 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="dnsmasq-dns"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153807 4795 memory_manager.go:354] "RemoveStaleState removing state"
podUID="11cc466e-0752-46b5-9775-c29748b13724" containerName="mariadb-account-create-update"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.154811 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.208508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"]
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.234932 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.235268 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.235404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpq8v\" (UniqueName: \"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.235564 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.235689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpq8v\" (UniqueName: \"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337692 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337815 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.338028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.338368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.338652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.338729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.360026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpq8v\" (UniqueName: \"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.495258 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.525262 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" path="/var/lib/kubelet/pods/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80/volumes"
Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.926888 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"]
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.009175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerStarted","Data":"6a5cc19b4b04424e6441590aca0708de64c3cb94b9b2a21745480c88aa7a5c4f"}
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.267017 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.271758 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.273913 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.274294 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.281104 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-slsb4"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.281120 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.296567 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351780 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bh4z\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351961 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bh4z\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453703 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453744 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.454097 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.454217 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.454319 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:24.954299582 +0000 UTC m=+1036.146817446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.454371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.454108 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.454136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.473401 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.483508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bh4z\" (UniqueName:
\"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.489450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.737052 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"]
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.738228 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.739821 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.740672 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.741874 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.756586 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"]
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863232 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863368 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863566 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863672 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863710 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863760 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.964724 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965005 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965318 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965936 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.965206 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 21:45:24 crc kubenswrapper[4795]:
E0219 21:45:24.965992 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.966032 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:25.96601713 +0000 UTC m=+1037.158534994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.966496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.966646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.969903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.970497 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.970790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.985068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.017586 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerID="c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced" exitCode=0
Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.017701 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerDied","Data":"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced"}
Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.054039 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xmbg2"
Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.522655 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"]
Feb 19 21:45:25 crc kubenswrapper[4795]: W0219 21:45:25.537151 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8945f31_b1d9_4c65_9f8c_2619f87d4237.slice/crio-ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93 WatchSource:0}: Error finding container ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93: Status 404 returned error can't find the container with id ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93
Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.985758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:25 crc kubenswrapper[4795]: E0219 21:45:25.985940 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 21:45:25 crc kubenswrapper[4795]: E0219 21:45:25.985961 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 21:45:25 crc kubenswrapper[4795]: E0219 21:45:25.986013 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:27.985997614 +0000 UTC m=+1039.178515478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found
Feb 19 21:45:26 crc kubenswrapper[4795]: I0219 21:45:26.029231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerStarted","Data":"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86"}
Feb 19 21:45:26 crc kubenswrapper[4795]: I0219 21:45:26.029385 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:26 crc kubenswrapper[4795]: I0219 21:45:26.030603 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xmbg2" event={"ID":"f8945f31-b1d9-4c65-9f8c-2619f87d4237","Type":"ContainerStarted","Data":"ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93"}
Feb 19 21:45:28 crc kubenswrapper[4795]: I0219 21:45:28.026231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0"
Feb 19 21:45:28 crc kubenswrapper[4795]: E0219 21:45:28.026513 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 21:45:28 crc kubenswrapper[4795]: E0219 21:45:28.027748 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 21:45:28 crc kubenswrapper[4795]: E0219 21:45:28.027832 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift
podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:32.027805159 +0000 UTC m=+1043.220323043 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found Feb 19 21:45:29 crc kubenswrapper[4795]: I0219 21:45:29.330731 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 19 21:45:29 crc kubenswrapper[4795]: I0219 21:45:29.331074 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 19 21:45:29 crc kubenswrapper[4795]: I0219 21:45:29.409933 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 19 21:45:29 crc kubenswrapper[4795]: I0219 21:45:29.433833 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" podStartSLOduration=6.433813901 podStartE2EDuration="6.433813901s" podCreationTimestamp="2026-02-19 21:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:26.046965794 +0000 UTC m=+1037.239483658" watchObservedRunningTime="2026-02-19 21:45:29.433813901 +0000 UTC m=+1040.626331765" Feb 19 21:45:30 crc kubenswrapper[4795]: I0219 21:45:30.126825 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 19 21:45:31 crc kubenswrapper[4795]: I0219 21:45:31.067460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xmbg2" event={"ID":"f8945f31-b1d9-4c65-9f8c-2619f87d4237","Type":"ContainerStarted","Data":"2c21dc2092bc9760ef50273deb040b6251a319d32c5e2c0fdd3bf1678ba55094"} Feb 
19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.099643 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:32 crc kubenswrapper[4795]: E0219 21:45:32.099910 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 21:45:32 crc kubenswrapper[4795]: E0219 21:45:32.099925 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 21:45:32 crc kubenswrapper[4795]: E0219 21:45:32.099966 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:40.099952573 +0000 UTC m=+1051.292470427 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.107141 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-xmbg2" podStartSLOduration=3.5614602250000003 podStartE2EDuration="8.107121795s" podCreationTimestamp="2026-02-19 21:45:24 +0000 UTC" firstStartedPulling="2026-02-19 21:45:25.541327133 +0000 UTC m=+1036.733844997" lastFinishedPulling="2026-02-19 21:45:30.086988703 +0000 UTC m=+1041.279506567" observedRunningTime="2026-02-19 21:45:31.096054531 +0000 UTC m=+1042.288572395" watchObservedRunningTime="2026-02-19 21:45:32.107121795 +0000 UTC m=+1043.299639659" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.110280 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-q8t82"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.112040 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.120854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8t82"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.201236 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.201617 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.225621 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.226635 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.233755 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.239153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.304562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.304666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.304700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.304757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: 
\"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.305472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.317899 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-lqp9l"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.318882 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.329297 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lqp9l"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.329948 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.406378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.406439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.406465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.406497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmkw\" (UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.407562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.408893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.410158 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.411985 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.419632 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.423925 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.498128 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.516537 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.516944 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.516976 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.517020 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmkw\" (UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.518225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.534133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmkw\" (UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.541552 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.618062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.618179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.619575 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.644742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.661820 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.755544 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.949472 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8t82"] Feb 19 21:45:33 crc kubenswrapper[4795]: W0219 21:45:33.055895 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb1c9562_0143_4fa4_86d3_f1ed93f3fa31.slice/crio-4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447 WatchSource:0}: Error finding container 4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447: Status 404 returned error can't find the container with id 4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447 Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.058044 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"] Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.089317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41fb-account-create-update-ntc9w" event={"ID":"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31","Type":"ContainerStarted","Data":"4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447"} Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.090552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8t82" event={"ID":"65449c45-b8f9-445e-80e7-6e3c8541c62c","Type":"ContainerStarted","Data":"867f54bd3a17809eff72b8c0887ca1d6cd2443800eed44f06f2ec6266a03d5ea"} Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.139881 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lqp9l"] Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 
21:45:33.236339 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"] Feb 19 21:45:33 crc kubenswrapper[4795]: W0219 21:45:33.240563 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod541fd524_94f2_4149_b16b_ab11a716ff95.slice/crio-3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5 WatchSource:0}: Error finding container 3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5: Status 404 returned error can't find the container with id 3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5 Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.498355 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.555279 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"] Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.555537 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="dnsmasq-dns" containerID="cri-o://05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603" gracePeriod=10 Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.098370 4795 generic.go:334] "Generic (PLEG): container finished" podID="541fd524-94f2-4149-b16b-ab11a716ff95" containerID="95c63f1640980c95f161952724dcddb5ac630545c512a2aa1ea3882e47e48df9" exitCode=0 Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.098579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-hdlzx" event={"ID":"541fd524-94f2-4149-b16b-ab11a716ff95","Type":"ContainerDied","Data":"95c63f1640980c95f161952724dcddb5ac630545c512a2aa1ea3882e47e48df9"} Feb 19 21:45:34 crc kubenswrapper[4795]: 
I0219 21:45:34.098602 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-hdlzx" event={"ID":"541fd524-94f2-4149-b16b-ab11a716ff95","Type":"ContainerStarted","Data":"3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5"} Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.101630 4795 generic.go:334] "Generic (PLEG): container finished" podID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" containerID="c3d76989d78df6e0877ea687626ca27c833a6485113f8ba0b36de864c9998267" exitCode=0 Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.101786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqp9l" event={"ID":"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd","Type":"ContainerDied","Data":"c3d76989d78df6e0877ea687626ca27c833a6485113f8ba0b36de864c9998267"} Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.101811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqp9l" event={"ID":"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd","Type":"ContainerStarted","Data":"4dddb577653131d07177fdafb62bca6d10cf85c3012b351bdb9691eeb4630158"} Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.105356 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" containerID="83bc6dd6efe6f9962ae47d6fb7eb26b3205715ecf2651537919f42966aa1ce32" exitCode=0 Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.105407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41fb-account-create-update-ntc9w" event={"ID":"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31","Type":"ContainerDied","Data":"83bc6dd6efe6f9962ae47d6fb7eb26b3205715ecf2651537919f42966aa1ce32"} Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.106898 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.107048 4795 generic.go:334] "Generic (PLEG): container finished" podID="65449c45-b8f9-445e-80e7-6e3c8541c62c" containerID="d90138519ffcbc0102f0a7e9dc5bfa3f8f09f718ea4e3fd3a5f7e537cfb53122" exitCode=0 Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.107110 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8t82" event={"ID":"65449c45-b8f9-445e-80e7-6e3c8541c62c","Type":"ContainerDied","Data":"d90138519ffcbc0102f0a7e9dc5bfa3f8f09f718ea4e3fd3a5f7e537cfb53122"} Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108527 4795 generic.go:334] "Generic (PLEG): container finished" podID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerID="05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603" exitCode=0 Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerDied","Data":"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603"} Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerDied","Data":"30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d"} Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108582 4795 scope.go:117] "RemoveContainer" containerID="05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108656 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.130701 4795 scope.go:117] "RemoveContainer" containerID="7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.161553 4795 scope.go:117] "RemoveContainer" containerID="05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603" Feb 19 21:45:34 crc kubenswrapper[4795]: E0219 21:45:34.164674 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603\": container with ID starting with 05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603 not found: ID does not exist" containerID="05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.164711 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603"} err="failed to get container status \"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603\": rpc error: code = NotFound desc = could not find container \"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603\": container with ID starting with 05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603 not found: ID does not exist" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.164732 4795 scope.go:117] "RemoveContainer" containerID="7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6" Feb 19 21:45:34 crc kubenswrapper[4795]: E0219 21:45:34.166360 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6\": container with ID starting with 
7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6 not found: ID does not exist" containerID="7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.166385 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6"} err="failed to get container status \"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6\": rpc error: code = NotFound desc = could not find container \"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6\": container with ID starting with 7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6 not found: ID does not exist" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.255999 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.256147 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.256198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.256257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.256274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.261670 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw" (OuterVolumeSpecName: "kube-api-access-xddmw") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "kube-api-access-xddmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:34 crc kubenswrapper[4795]: E0219 21:45:34.300680 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc podName:9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:34.800654069 +0000 UTC m=+1045.993171933 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24") : error deleting /var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volume-subpaths: remove /var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volume-subpaths: no such file or directory Feb 19 21:45:34 crc kubenswrapper[4795]: E0219 21:45:34.300792 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb podName:9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:34.800783413 +0000 UTC m=+1045.993301287 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-sb" (UniqueName: "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24") : error deleting /var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volume-subpaths: remove /var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volume-subpaths: no such file or directory Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.300979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.301045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config" (OuterVolumeSpecName: "config") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.357709 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.357920 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.357983 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.864362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.864784 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " Feb 19 21:45:34 crc 
kubenswrapper[4795]: I0219 21:45:34.865386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.865572 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.967328 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.967374 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.996146 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.072388 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"] Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.075483 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"] Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.120275 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerID="f5dc53fcd687359370a9224413921410e03027d27bb1e741143948af4422db6c" exitCode=0 Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.120426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerDied","Data":"f5dc53fcd687359370a9224413921410e03027d27bb1e741143948af4422db6c"} Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.556116 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" path="/var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volumes" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.579121 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.689730 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") pod \"541fd524-94f2-4149-b16b-ab11a716ff95\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.689784 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") pod \"541fd524-94f2-4149-b16b-ab11a716ff95\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.690736 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "541fd524-94f2-4149-b16b-ab11a716ff95" (UID: "541fd524-94f2-4149-b16b-ab11a716ff95"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.704008 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk" (OuterVolumeSpecName: "kube-api-access-xf4dk") pod "541fd524-94f2-4149-b16b-ab11a716ff95" (UID: "541fd524-94f2-4149-b16b-ab11a716ff95"). InnerVolumeSpecName "kube-api-access-xf4dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.787544 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.791669 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.791700 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.794559 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.799468 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") pod \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") pod \"65449c45-b8f9-445e-80e7-6e3c8541c62c\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892874 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") pod \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892939 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") pod \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892957 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hmkw\" (UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") pod \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892977 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") pod \"65449c45-b8f9-445e-80e7-6e3c8541c62c\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.893648 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" (UID: "ddebb2b6-7bc0-45af-ba68-ae108b0d91fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.893647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65449c45-b8f9-445e-80e7-6e3c8541c62c" (UID: "65449c45-b8f9-445e-80e7-6e3c8541c62c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.893647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" (UID: "fb1c9562-0143-4fa4-86d3-f1ed93f3fa31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.897691 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df" (OuterVolumeSpecName: "kube-api-access-969df") pod "fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" (UID: "fb1c9562-0143-4fa4-86d3-f1ed93f3fa31"). 
InnerVolumeSpecName "kube-api-access-969df". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.897834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw" (OuterVolumeSpecName: "kube-api-access-4hmkw") pod "ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" (UID: "ddebb2b6-7bc0-45af-ba68-ae108b0d91fd"). InnerVolumeSpecName "kube-api-access-4hmkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.898601 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf" (OuterVolumeSpecName: "kube-api-access-tchxf") pod "65449c45-b8f9-445e-80e7-6e3c8541c62c" (UID: "65449c45-b8f9-445e-80e7-6e3c8541c62c"). InnerVolumeSpecName "kube-api-access-tchxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994507 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994536 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994546 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994555 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hmkw\" 
(UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994564 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994574 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.128345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqp9l" event={"ID":"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd","Type":"ContainerDied","Data":"4dddb577653131d07177fdafb62bca6d10cf85c3012b351bdb9691eeb4630158"} Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.128681 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dddb577653131d07177fdafb62bca6d10cf85c3012b351bdb9691eeb4630158" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.128742 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.141691 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.141904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41fb-account-create-update-ntc9w" event={"ID":"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31","Type":"ContainerDied","Data":"4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447"} Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.141938 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.144435 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.144439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8t82" event={"ID":"65449c45-b8f9-445e-80e7-6e3c8541c62c","Type":"ContainerDied","Data":"867f54bd3a17809eff72b8c0887ca1d6cd2443800eed44f06f2ec6266a03d5ea"} Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.144555 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="867f54bd3a17809eff72b8c0887ca1d6cd2443800eed44f06f2ec6266a03d5ea" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.147001 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerStarted","Data":"5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e"} Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.147571 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.148800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-hdlzx" 
event={"ID":"541fd524-94f2-4149-b16b-ab11a716ff95","Type":"ContainerDied","Data":"3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5"} Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.148827 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.148858 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-hdlzx" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.187414 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.154715879 podStartE2EDuration="1m0.18739142s" podCreationTimestamp="2026-02-19 21:44:36 +0000 UTC" firstStartedPulling="2026-02-19 21:44:38.460652946 +0000 UTC m=+989.653170810" lastFinishedPulling="2026-02-19 21:45:01.493328487 +0000 UTC m=+1012.685846351" observedRunningTime="2026-02-19 21:45:36.180941408 +0000 UTC m=+1047.373459282" watchObservedRunningTime="2026-02-19 21:45:36.18739142 +0000 UTC m=+1047.379909284" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.249094 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7ktnd"] Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250220 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="init" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250244 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="init" Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250268 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" containerName="mariadb-account-create-update" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 
21:45:36.250277 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" containerName="mariadb-account-create-update" Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250291 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541fd524-94f2-4149-b16b-ab11a716ff95" containerName="mariadb-account-create-update" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250300 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="541fd524-94f2-4149-b16b-ab11a716ff95" containerName="mariadb-account-create-update" Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250310 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="dnsmasq-dns" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250318 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="dnsmasq-dns" Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250350 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65449c45-b8f9-445e-80e7-6e3c8541c62c" containerName="mariadb-database-create" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250358 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="65449c45-b8f9-445e-80e7-6e3c8541c62c" containerName="mariadb-database-create" Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250373 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" containerName="mariadb-database-create" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250380 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" containerName="mariadb-database-create" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250588 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" 
containerName="mariadb-database-create" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250607 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="65449c45-b8f9-445e-80e7-6e3c8541c62c" containerName="mariadb-database-create" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250620 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="541fd524-94f2-4149-b16b-ab11a716ff95" containerName="mariadb-account-create-update" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250637 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="dnsmasq-dns" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250651 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" containerName="mariadb-account-create-update" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.251251 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.265662 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7ktnd"] Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.348649 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.349625 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.366638 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.372387 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.406148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.406266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.510000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.510057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " 
pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.510115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.510151 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.511047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.529513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.570478 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.611515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.611620 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.612673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.638678 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.682967 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.081911 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7ktnd"] Feb 19 21:45:37 crc kubenswrapper[4795]: W0219 21:45:37.083116 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2c0e289_4e3b_4b5a_93db_d38621a870ec.slice/crio-f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e WatchSource:0}: Error finding container f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e: Status 404 returned error can't find the container with id f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.156595 4795 generic.go:334] "Generic (PLEG): container finished" podID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" containerID="2c21dc2092bc9760ef50273deb040b6251a319d32c5e2c0fdd3bf1678ba55094" exitCode=0 Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.156683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xmbg2" event={"ID":"f8945f31-b1d9-4c65-9f8c-2619f87d4237","Type":"ContainerDied","Data":"2c21dc2092bc9760ef50273deb040b6251a319d32c5e2c0fdd3bf1678ba55094"} Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.157798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7ktnd" event={"ID":"a2c0e289-4e3b-4b5a-93db-d38621a870ec","Type":"ContainerStarted","Data":"f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e"} Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.231216 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.644210 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w9fbs" 
podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" probeResult="failure" output=< Feb 19 21:45:37 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 21:45:37 crc kubenswrapper[4795]: > Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.980324 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.985223 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.078894 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jr6xc"] Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.080123 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.082188 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.099290 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jr6xc"] Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.174186 4795 generic.go:334] "Generic (PLEG): container finished" podID="890a044b-0060-4feb-866b-9a9e80bfa706" containerID="f13d7ab12cdcba95909b27dcd1c1e77cc3443dbab3be6042ac8015c2168e9280" exitCode=0 Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.174278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f769-account-create-update-25m5x" event={"ID":"890a044b-0060-4feb-866b-9a9e80bfa706","Type":"ContainerDied","Data":"f13d7ab12cdcba95909b27dcd1c1e77cc3443dbab3be6042ac8015c2168e9280"} Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.174323 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-f769-account-create-update-25m5x" event={"ID":"890a044b-0060-4feb-866b-9a9e80bfa706","Type":"ContainerStarted","Data":"62a552ba7e9d49668b066ddc893227489242cacf8fb293f6b71e0d3a5c13b2b2"} Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.176131 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" containerID="00ef13a6fc1812f0a18c09ccfd124e49c733a0a626dc7ae23746169010b14516" exitCode=0 Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.176280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7ktnd" event={"ID":"a2c0e289-4e3b-4b5a-93db-d38621a870ec","Type":"ContainerDied","Data":"00ef13a6fc1812f0a18c09ccfd124e49c733a0a626dc7ae23746169010b14516"} Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.242662 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.242776 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.343711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " 
pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.343800 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.345380 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.361147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.434480 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.527888 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659399 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659460 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659522 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659556 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.660430 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.660804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.683784 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.688311 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.690577 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp" (OuterVolumeSpecName: "kube-api-access-svtdp") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "kube-api-access-svtdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.690777 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts" (OuterVolumeSpecName: "scripts") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.706682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761641 4795 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761674 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761684 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761695 4795 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761713 4795 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761721 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761729 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: W0219 21:45:38.894362 4795 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b15ba11_a170_4fac_bac1_15ecf9de7379.slice/crio-31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3 WatchSource:0}: Error finding container 31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3: Status 404 returned error can't find the container with id 31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3 Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.895965 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jr6xc"] Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.185664 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jr6xc" event={"ID":"5b15ba11-a170-4fac-bac1-15ecf9de7379","Type":"ContainerStarted","Data":"27f5e555e30e338ab9e5ae7facd6ba963cf6336bbb4bec245c099c4afa8bb6a3"} Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.185964 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jr6xc" event={"ID":"5b15ba11-a170-4fac-bac1-15ecf9de7379","Type":"ContainerStarted","Data":"31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3"} Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.187452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xmbg2" event={"ID":"f8945f31-b1d9-4c65-9f8c-2619f87d4237","Type":"ContainerDied","Data":"ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93"} Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.187493 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.187567 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.216111 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-jr6xc" podStartSLOduration=1.216094648 podStartE2EDuration="1.216094648s" podCreationTimestamp="2026-02-19 21:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:39.20941898 +0000 UTC m=+1050.401936844" watchObservedRunningTime="2026-02-19 21:45:39.216094648 +0000 UTC m=+1050.408612512" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.525649 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cc466e-0752-46b5-9775-c29748b13724" path="/var/lib/kubelet/pods/11cc466e-0752-46b5-9775-c29748b13724/volumes" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.557640 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.616277 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.700618 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") pod \"890a044b-0060-4feb-866b-9a9e80bfa706\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.701284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") pod \"890a044b-0060-4feb-866b-9a9e80bfa706\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.702330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "890a044b-0060-4feb-866b-9a9e80bfa706" (UID: "890a044b-0060-4feb-866b-9a9e80bfa706"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.708840 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2" (OuterVolumeSpecName: "kube-api-access-tlbk2") pod "890a044b-0060-4feb-866b-9a9e80bfa706" (UID: "890a044b-0060-4feb-866b-9a9e80bfa706"). InnerVolumeSpecName "kube-api-access-tlbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802260 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") pod \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802297 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") pod \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802620 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802637 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802822 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2c0e289-4e3b-4b5a-93db-d38621a870ec" (UID: "a2c0e289-4e3b-4b5a-93db-d38621a870ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.805324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5" (OuterVolumeSpecName: "kube-api-access-t4wn5") pod "a2c0e289-4e3b-4b5a-93db-d38621a870ec" (UID: "a2c0e289-4e3b-4b5a-93db-d38621a870ec"). InnerVolumeSpecName "kube-api-access-t4wn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.921642 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.921685 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.124581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.129191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.189775 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.201934 4795 generic.go:334] "Generic (PLEG): container finished" podID="5b15ba11-a170-4fac-bac1-15ecf9de7379" containerID="27f5e555e30e338ab9e5ae7facd6ba963cf6336bbb4bec245c099c4afa8bb6a3" exitCode=0 Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.202035 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jr6xc" event={"ID":"5b15ba11-a170-4fac-bac1-15ecf9de7379","Type":"ContainerDied","Data":"27f5e555e30e338ab9e5ae7facd6ba963cf6336bbb4bec245c099c4afa8bb6a3"} Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.203768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f769-account-create-update-25m5x" event={"ID":"890a044b-0060-4feb-866b-9a9e80bfa706","Type":"ContainerDied","Data":"62a552ba7e9d49668b066ddc893227489242cacf8fb293f6b71e0d3a5c13b2b2"} Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.203806 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62a552ba7e9d49668b066ddc893227489242cacf8fb293f6b71e0d3a5c13b2b2" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.203806 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.205019 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7ktnd" event={"ID":"a2c0e289-4e3b-4b5a-93db-d38621a870ec","Type":"ContainerDied","Data":"f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e"} Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.205037 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.205083 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.810316 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:45:40 crc kubenswrapper[4795]: W0219 21:45:40.812530 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c773ec2_a400_42a9_8784_ed9c295c3bb4.slice/crio-5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36 WatchSource:0}: Error finding container 5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36: Status 404 returned error can't find the container with id 5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36 Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.213235 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerID="90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55" exitCode=0 Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.213325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerDied","Data":"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55"} Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.214553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36"} Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.597134 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:45:41 crc kubenswrapper[4795]: E0219 21:45:41.597862 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890a044b-0060-4feb-866b-9a9e80bfa706" containerName="mariadb-account-create-update" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.597878 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="890a044b-0060-4feb-866b-9a9e80bfa706" containerName="mariadb-account-create-update" Feb 19 21:45:41 crc kubenswrapper[4795]: E0219 21:45:41.597896 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" containerName="mariadb-database-create" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.597904 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" containerName="mariadb-database-create" Feb 19 21:45:41 crc kubenswrapper[4795]: E0219 21:45:41.597917 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" containerName="swift-ring-rebalance" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.597923 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" containerName="swift-ring-rebalance" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.598075 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" containerName="mariadb-database-create" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.598087 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" containerName="swift-ring-rebalance" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.598102 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="890a044b-0060-4feb-866b-9a9e80bfa706" containerName="mariadb-account-create-update" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.598597 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.603177 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.603439 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6rhfn" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.616739 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.757797 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.758940 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.758964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.758988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.759019 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.860682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") pod \"5b15ba11-a170-4fac-bac1-15ecf9de7379\" (UID: 
\"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.860778 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") pod \"5b15ba11-a170-4fac-bac1-15ecf9de7379\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861190 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861715 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b15ba11-a170-4fac-bac1-15ecf9de7379" (UID: "5b15ba11-a170-4fac-bac1-15ecf9de7379"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.865527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.865989 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.866353 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp" (OuterVolumeSpecName: "kube-api-access-7dvbp") pod "5b15ba11-a170-4fac-bac1-15ecf9de7379" (UID: "5b15ba11-a170-4fac-bac1-15ecf9de7379"). InnerVolumeSpecName "kube-api-access-7dvbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.875070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.890804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.918494 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.963865 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.963903 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.226807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerStarted","Data":"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a"} Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.227023 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.229777 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jr6xc" event={"ID":"5b15ba11-a170-4fac-bac1-15ecf9de7379","Type":"ContainerDied","Data":"31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3"} Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.230048 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.229870 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.259605 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371970.595188 podStartE2EDuration="1m6.259588432s" podCreationTimestamp="2026-02-19 21:44:36 +0000 UTC" firstStartedPulling="2026-02-19 21:44:37.978295277 +0000 UTC m=+989.170813141" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:42.252293187 +0000 UTC m=+1053.444811051" watchObservedRunningTime="2026-02-19 21:45:42.259588432 +0000 UTC m=+1053.452106296" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.556745 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.646221 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" probeResult="failure" output=< Feb 19 21:45:42 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 21:45:42 crc kubenswrapper[4795]: > Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 
21:45:42.685194 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.695567 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.923637 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:42 crc kubenswrapper[4795]: E0219 21:45:42.924363 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b15ba11-a170-4fac-bac1-15ecf9de7379" containerName="mariadb-account-create-update" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.924376 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b15ba11-a170-4fac-bac1-15ecf9de7379" containerName="mariadb-account-create-update" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.924535 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b15ba11-a170-4fac-bac1-15ecf9de7379" containerName="mariadb-account-create-update" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.925042 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.928573 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.931661 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.081802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.081863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.081891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.082073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: 
\"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.082155 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.082266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184271 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184508 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184831 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184874 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.185455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.186364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.204719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.245105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117"} Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.246906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2wbff" event={"ID":"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8","Type":"ContainerStarted","Data":"256eb7cf5acf7b851870ab83b4723dabaee1fbaa60ee04f8ea61103c387af2ad"} Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.329095 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.819633 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.262717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs-config-d9rr5" event={"ID":"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd","Type":"ContainerStarted","Data":"7c32d427d23010f8ec7644388fc5c6ced9ee00bbf0a6575354abad70dc15498d"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.263009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs-config-d9rr5" event={"ID":"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd","Type":"ContainerStarted","Data":"65876dc5d7abd6c76f19edc3979a74c5098fbab43150468cdad7e42ab8b1ff48"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.265892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.265934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.265944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.282312 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-w9fbs-config-d9rr5" podStartSLOduration=2.282298148 podStartE2EDuration="2.282298148s" podCreationTimestamp="2026-02-19 21:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:44.281962818 +0000 UTC m=+1055.474480672" watchObservedRunningTime="2026-02-19 21:45:44.282298148 +0000 UTC m=+1055.474816002" Feb 19 21:45:45 crc kubenswrapper[4795]: I0219 21:45:45.279123 4795 generic.go:334] "Generic (PLEG): container finished" podID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" containerID="7c32d427d23010f8ec7644388fc5c6ced9ee00bbf0a6575354abad70dc15498d" exitCode=0 Feb 19 21:45:45 crc kubenswrapper[4795]: I0219 21:45:45.279376 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs-config-d9rr5" event={"ID":"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd","Type":"ContainerDied","Data":"7c32d427d23010f8ec7644388fc5c6ced9ee00bbf0a6575354abad70dc15498d"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.289460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.289773 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.289784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.289795 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.617705 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.762969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763108 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763204 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") pod 
\"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run" (OuterVolumeSpecName: "var-run") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763326 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763377 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763541 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763911 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763929 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763941 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.764135 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts" (OuterVolumeSpecName: "scripts") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.768688 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs" (OuterVolumeSpecName: "kube-api-access-4d8gs") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "kube-api-access-4d8gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.865653 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.865679 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.865690 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.301207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs-config-d9rr5" event={"ID":"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd","Type":"ContainerDied","Data":"65876dc5d7abd6c76f19edc3979a74c5098fbab43150468cdad7e42ab8b1ff48"} Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.301583 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65876dc5d7abd6c76f19edc3979a74c5098fbab43150468cdad7e42ab8b1ff48" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.301662 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.315704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a"} Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.315742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84"} Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.405877 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.415472 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.521700 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" path="/var/lib/kubelet/pods/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd/volumes" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.653938 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-w9fbs" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.936351 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333107 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333667 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333691 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.669355 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.519956086 podStartE2EDuration="25.669335813s" podCreationTimestamp="2026-02-19 21:45:23 +0000 UTC" firstStartedPulling="2026-02-19 21:45:40.815880836 +0000 UTC m=+1052.008398700" lastFinishedPulling="2026-02-19 21:45:46.965260563 +0000 UTC m=+1058.157778427" observedRunningTime="2026-02-19 21:45:48.381026682 +0000 UTC m=+1059.573544546" watchObservedRunningTime="2026-02-19 21:45:48.669335813 +0000 UTC m=+1059.861853677" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.675508 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:45:48 crc kubenswrapper[4795]: E0219 21:45:48.675845 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" containerName="ovn-config" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.675861 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" containerName="ovn-config" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.676027 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" containerName="ovn-config" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.676979 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.680357 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.687000 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814790 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814991 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.815025 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916762 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916819 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.917711 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.918325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.919503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.920018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.920110 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.942596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") pod 
\"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:49 crc kubenswrapper[4795]: I0219 21:45:49.036755 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:49 crc kubenswrapper[4795]: I0219 21:45:49.479648 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.446587 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.772052 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.773333 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.786838 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.863606 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.864717 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.867146 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.886292 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.893062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.893221 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: W0219 21:45:57.966992 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f384824_f8ad_42d9_b09b_decb5280b448.slice/crio-13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d WatchSource:0}: Error finding container 13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d: Status 404 returned error can't find the container with id 13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.968996 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.970017 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.975283 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.995793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.995865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.995916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.995966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.996608 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.017505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.075679 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.076667 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.078529 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.091279 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.091484 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-snb69" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.097418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.097576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.097648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.097687 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.098460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") pod 
\"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.116292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.151355 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.152248 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.157889 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.158259 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.158630 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8pz2x" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.165388 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.182519 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.190917 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf27b\" (UniqueName: \"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199433 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.200521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.210997 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.212020 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.226673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.251182 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.295517 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.298064 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301208 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf27b\" (UniqueName: 
\"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301324 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301346 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301534 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.304630 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.305010 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.305106 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.316797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.323761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.325018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf27b\" (UniqueName: 
\"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.402684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.402741 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.402804 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.402848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.405359 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.425714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.442977 4795 generic.go:334] "Generic (PLEG): container finished" podID="2f384824-f8ad-42d9-b09b-decb5280b448" containerID="269c23977ee5c8c89a33c0c6142172272e66947879bd10de823a2ea6ed071bb6" exitCode=0 Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.443028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerDied","Data":"269c23977ee5c8c89a33c0c6142172272e66947879bd10de823a2ea6ed071bb6"} Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.443061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerStarted","Data":"13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d"} Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.446923 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:45:58 crc kubenswrapper[4795]: W0219 21:45:58.448386 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18561896_d336_4962_8e9e_4ccf748f8605.slice/crio-237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3 WatchSource:0}: 
Error finding container 237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3: Status 404 returned error can't find the container with id 237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3 Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.494683 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.503627 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.503801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.504851 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.511493 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.518589 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.520009 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.546935 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.722155 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.781213 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.861695 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.146028 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.154963 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"] Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.255354 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:45:59 crc kubenswrapper[4795]: W0219 21:45:59.267110 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc70b8f2_4f1b_4b6e_b657_66aac1cbfa23.slice/crio-884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4 
WatchSource:0}: Error finding container 884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4: Status 404 returned error can't find the container with id 884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4 Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.358926 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.456084 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-mc6fv" event={"ID":"57961551-d4f8-4586-b255-8810fbdb499a","Type":"ContainerStarted","Data":"122378ef931bbdce5fbd1c31b593170eeae5e5fed8a74b40fbb0d536de3a3d22"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.456127 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-mc6fv" event={"ID":"57961551-d4f8-4586-b255-8810fbdb499a","Type":"ContainerStarted","Data":"22d7d81ad8b9d1148d11d4477982eeefb07c589c029993c9da669c9f0edf5d61"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.469754 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2wbff" event={"ID":"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8","Type":"ContainerStarted","Data":"f3e360ac25dc639935c61b2baa9937c7f07b88ebec95aede182ab25a4907955c"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.473979 4795 generic.go:334] "Generic (PLEG): container finished" podID="18561896-d336-4962-8e9e-4ccf748f8605" containerID="6b0c6da228cd7d133c73866bc31005f08b881ce2ffda7a1dc8905fe2cbf580b7" exitCode=0 Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.474081 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-snb69" event={"ID":"18561896-d336-4962-8e9e-4ccf748f8605","Type":"ContainerDied","Data":"6b0c6da228cd7d133c73866bc31005f08b881ce2ffda7a1dc8905fe2cbf580b7"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.474105 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-snb69" event={"ID":"18561896-d336-4962-8e9e-4ccf748f8605","Type":"ContainerStarted","Data":"237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.476253 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-d602-account-create-update-mc6fv" podStartSLOduration=1.476237266 podStartE2EDuration="1.476237266s" podCreationTimestamp="2026-02-19 21:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:59.474462245 +0000 UTC m=+1070.666980109" watchObservedRunningTime="2026-02-19 21:45:59.476237266 +0000 UTC m=+1070.668755130" Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.479807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxql7" event={"ID":"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23","Type":"ContainerStarted","Data":"884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.483223 4795 generic.go:334] "Generic (PLEG): container finished" podID="73f01f44-1467-442f-b91f-ac1765626a3d" containerID="464384efe0cd85358add102341feabf48351464a71bfc2ad9ed2ce3ea45a4a94" exitCode=0 Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.483312 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vlmnn" event={"ID":"73f01f44-1467-442f-b91f-ac1765626a3d","Type":"ContainerDied","Data":"464384efe0cd85358add102341feabf48351464a71bfc2ad9ed2ce3ea45a4a94"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.483362 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vlmnn" 
event={"ID":"73f01f44-1467-442f-b91f-ac1765626a3d","Type":"ContainerStarted","Data":"0abef5e7947f0e3cbddf50027fa385aebc0f4b6ccd2ad482c073299aac2c5e2e"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.484653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c93-account-create-update-ptrqq" event={"ID":"4cb700b2-4c29-4deb-a379-d18f2695dcaf","Type":"ContainerStarted","Data":"3989bba1eed50aec45faea3f156750277605314604317a5496d6bb8b3902c171"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.490919 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d152069-2c3d-4cf4-94e8-3068e24def9f" containerID="62e8bfd862eff78d929edc90f98ab271d6cbd53192119320b7d77f0dddb03767" exitCode=0 Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.490980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-n57zq" event={"ID":"7d152069-2c3d-4cf4-94e8-3068e24def9f","Type":"ContainerDied","Data":"62e8bfd862eff78d929edc90f98ab271d6cbd53192119320b7d77f0dddb03767"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.491003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-n57zq" event={"ID":"7d152069-2c3d-4cf4-94e8-3068e24def9f","Type":"ContainerStarted","Data":"a0c267ff5a8f3bc00149f43c149c6b31e9b75bde6755d31bd67947fa2475dd75"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.492665 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8jt8c" event={"ID":"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c","Type":"ContainerStarted","Data":"83550b0e412e3c0f2147b2389e1c3220efde13a4057f9e67e612e0461d3172fc"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.492697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8jt8c" 
event={"ID":"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c","Type":"ContainerStarted","Data":"6e153c4a49c239a97a8eb900ea2dbb494088e83e1efd11bec5254c0600bc3983"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.505926 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerStarted","Data":"2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.506493 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.514730 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-2wbff" podStartSLOduration=3.085979002 podStartE2EDuration="18.514709771s" podCreationTimestamp="2026-02-19 21:45:41 +0000 UTC" firstStartedPulling="2026-02-19 21:45:42.819619277 +0000 UTC m=+1054.012137141" lastFinishedPulling="2026-02-19 21:45:58.248350046 +0000 UTC m=+1069.440867910" observedRunningTime="2026-02-19 21:45:59.514193806 +0000 UTC m=+1070.706711670" watchObservedRunningTime="2026-02-19 21:45:59.514709771 +0000 UTC m=+1070.707227635" Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.596929 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-8jt8c" podStartSLOduration=1.5969009889999999 podStartE2EDuration="1.596900989s" podCreationTimestamp="2026-02-19 21:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:59.591861776 +0000 UTC m=+1070.784379640" watchObservedRunningTime="2026-02-19 21:45:59.596900989 +0000 UTC m=+1070.789418853" Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.635047 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-768666cd57-5c55b" podStartSLOduration=11.635026294 podStartE2EDuration="11.635026294s" podCreationTimestamp="2026-02-19 21:45:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:59.630480386 +0000 UTC m=+1070.822998250" watchObservedRunningTime="2026-02-19 21:45:59.635026294 +0000 UTC m=+1070.827544158" Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.516792 4795 generic.go:334] "Generic (PLEG): container finished" podID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" containerID="83550b0e412e3c0f2147b2389e1c3220efde13a4057f9e67e612e0461d3172fc" exitCode=0 Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.516871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8jt8c" event={"ID":"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c","Type":"ContainerDied","Data":"83550b0e412e3c0f2147b2389e1c3220efde13a4057f9e67e612e0461d3172fc"} Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.522134 4795 generic.go:334] "Generic (PLEG): container finished" podID="57961551-d4f8-4586-b255-8810fbdb499a" containerID="122378ef931bbdce5fbd1c31b593170eeae5e5fed8a74b40fbb0d536de3a3d22" exitCode=0 Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.522199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-mc6fv" event={"ID":"57961551-d4f8-4586-b255-8810fbdb499a","Type":"ContainerDied","Data":"122378ef931bbdce5fbd1c31b593170eeae5e5fed8a74b40fbb0d536de3a3d22"} Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.535420 4795 generic.go:334] "Generic (PLEG): container finished" podID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" containerID="88bfd25384ba94c2658b1d015dafd73e32d146937b8caf9b1891261850b11f22" exitCode=0 Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.535713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-7c93-account-create-update-ptrqq" event={"ID":"4cb700b2-4c29-4deb-a379-d18f2695dcaf","Type":"ContainerDied","Data":"88bfd25384ba94c2658b1d015dafd73e32d146937b8caf9b1891261850b11f22"} Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.968449 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vlmnn" Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.969862 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.975250 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-snb69" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") pod \"7d152069-2c3d-4cf4-94e8-3068e24def9f\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081299 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") pod \"18561896-d336-4962-8e9e-4ccf748f8605\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081332 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") pod \"73f01f44-1467-442f-b91f-ac1765626a3d\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081381 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") pod \"73f01f44-1467-442f-b91f-ac1765626a3d\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081494 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") pod \"7d152069-2c3d-4cf4-94e8-3068e24def9f\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") pod \"18561896-d336-4962-8e9e-4ccf748f8605\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.085820 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d152069-2c3d-4cf4-94e8-3068e24def9f" (UID: "7d152069-2c3d-4cf4-94e8-3068e24def9f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.086306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73f01f44-1467-442f-b91f-ac1765626a3d" (UID: "73f01f44-1467-442f-b91f-ac1765626a3d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.087111 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18561896-d336-4962-8e9e-4ccf748f8605" (UID: "18561896-d336-4962-8e9e-4ccf748f8605"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.089020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs" (OuterVolumeSpecName: "kube-api-access-v72cs") pod "7d152069-2c3d-4cf4-94e8-3068e24def9f" (UID: "7d152069-2c3d-4cf4-94e8-3068e24def9f"). InnerVolumeSpecName "kube-api-access-v72cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.089328 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr" (OuterVolumeSpecName: "kube-api-access-88vvr") pod "18561896-d336-4962-8e9e-4ccf748f8605" (UID: "18561896-d336-4962-8e9e-4ccf748f8605"). InnerVolumeSpecName "kube-api-access-88vvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.091356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx" (OuterVolumeSpecName: "kube-api-access-9hxjx") pod "73f01f44-1467-442f-b91f-ac1765626a3d" (UID: "73f01f44-1467-442f-b91f-ac1765626a3d"). InnerVolumeSpecName "kube-api-access-9hxjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.182849 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183150 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183160 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183180 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183189 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183199 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.550434 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-n57zq" 
event={"ID":"7d152069-2c3d-4cf4-94e8-3068e24def9f","Type":"ContainerDied","Data":"a0c267ff5a8f3bc00149f43c149c6b31e9b75bde6755d31bd67947fa2475dd75"} Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.550480 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c267ff5a8f3bc00149f43c149c6b31e9b75bde6755d31bd67947fa2475dd75" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.550556 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.553251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-snb69" event={"ID":"18561896-d336-4962-8e9e-4ccf748f8605","Type":"ContainerDied","Data":"237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3"} Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.553279 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.553327 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-snb69" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.555592 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vlmnn" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.556233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vlmnn" event={"ID":"73f01f44-1467-442f-b91f-ac1765626a3d","Type":"ContainerDied","Data":"0abef5e7947f0e3cbddf50027fa385aebc0f4b6ccd2ad482c073299aac2c5e2e"} Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.556262 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0abef5e7947f0e3cbddf50027fa385aebc0f4b6ccd2ad482c073299aac2c5e2e" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.736803 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.763650 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.771843 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8jt8c" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.830348 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") pod \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.832947 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cb700b2-4c29-4deb-a379-d18f2695dcaf" (UID: "4cb700b2-4c29-4deb-a379-d18f2695dcaf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.833640 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") pod \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.835072 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") pod \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.836098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") pod \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.836496 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") pod \"57961551-d4f8-4586-b255-8810fbdb499a\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.836660 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") pod \"57961551-d4f8-4586-b255-8810fbdb499a\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.836772 4795 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" (UID: "6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.837095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57961551-d4f8-4586-b255-8810fbdb499a" (UID: "57961551-d4f8-4586-b255-8810fbdb499a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.838416 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.838439 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.838450 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.839672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm" (OuterVolumeSpecName: "kube-api-access-rfjgm") pod "4cb700b2-4c29-4deb-a379-d18f2695dcaf" (UID: "4cb700b2-4c29-4deb-a379-d18f2695dcaf"). InnerVolumeSpecName "kube-api-access-rfjgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.840561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb" (OuterVolumeSpecName: "kube-api-access-scxqb") pod "6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" (UID: "6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c"). InnerVolumeSpecName "kube-api-access-scxqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.840969 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2" (OuterVolumeSpecName: "kube-api-access-4kkx2") pod "57961551-d4f8-4586-b255-8810fbdb499a" (UID: "57961551-d4f8-4586-b255-8810fbdb499a"). InnerVolumeSpecName "kube-api-access-4kkx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.940387 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.940417 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.940426 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.039418 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.118661 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"] Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.118932 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="dnsmasq-dns" containerID="cri-o://5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" gracePeriod=10 Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.473782 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549416 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549559 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpq8v\" (UniqueName: 
\"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549640 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.554681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v" (OuterVolumeSpecName: "kube-api-access-cpq8v") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "kube-api-access-cpq8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.583212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8jt8c" event={"ID":"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c","Type":"ContainerDied","Data":"6e153c4a49c239a97a8eb900ea2dbb494088e83e1efd11bec5254c0600bc3983"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.583256 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e153c4a49c239a97a8eb900ea2dbb494088e83e1efd11bec5254c0600bc3983" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.583339 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8jt8c" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.585745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxql7" event={"ID":"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23","Type":"ContainerStarted","Data":"088f63c71340a869532178f76fc11ab9c0cbbdc5c1aeacd6e851cf5d40a46fa0"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.593700 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-mc6fv" event={"ID":"57961551-d4f8-4586-b255-8810fbdb499a","Type":"ContainerDied","Data":"22d7d81ad8b9d1148d11d4477982eeefb07c589c029993c9da669c9f0edf5d61"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.593743 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22d7d81ad8b9d1148d11d4477982eeefb07c589c029993c9da669c9f0edf5d61" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.593834 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.595659 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.600325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c93-account-create-update-ptrqq" event={"ID":"4cb700b2-4c29-4deb-a379-d18f2695dcaf","Type":"ContainerDied","Data":"3989bba1eed50aec45faea3f156750277605314604317a5496d6bb8b3902c171"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.600367 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3989bba1eed50aec45faea3f156750277605314604317a5496d6bb8b3902c171" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.600452 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.607185 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerID="5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" exitCode=0 Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.607238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerDied","Data":"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.607278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerDied","Data":"6a5cc19b4b04424e6441590aca0708de64c3cb94b9b2a21745480c88aa7a5c4f"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.607307 4795 scope.go:117] "RemoveContainer" containerID="5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.608282 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.617793 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dxql7" podStartSLOduration=2.255890913 podStartE2EDuration="6.61777349s" podCreationTimestamp="2026-02-19 21:45:58 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.270295187 +0000 UTC m=+1070.462813051" lastFinishedPulling="2026-02-19 21:46:03.632177754 +0000 UTC m=+1074.824695628" observedRunningTime="2026-02-19 21:46:04.604747742 +0000 UTC m=+1075.797265596" watchObservedRunningTime="2026-02-19 21:46:04.61777349 +0000 UTC m=+1075.810291354" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.620616 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config" (OuterVolumeSpecName: "config") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.622078 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.628653 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652387 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652424 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652438 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652449 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpq8v\" (UniqueName: \"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652462 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.692938 4795 scope.go:117] "RemoveContainer" containerID="c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.720048 4795 scope.go:117] "RemoveContainer" containerID="5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" Feb 19 21:46:04 crc kubenswrapper[4795]: E0219 21:46:04.720577 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86\": 
container with ID starting with 5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86 not found: ID does not exist" containerID="5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.720617 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86"} err="failed to get container status \"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86\": rpc error: code = NotFound desc = could not find container \"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86\": container with ID starting with 5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86 not found: ID does not exist" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.720644 4795 scope.go:117] "RemoveContainer" containerID="c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced" Feb 19 21:46:04 crc kubenswrapper[4795]: E0219 21:46:04.720991 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced\": container with ID starting with c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced not found: ID does not exist" containerID="c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.721014 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced"} err="failed to get container status \"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced\": rpc error: code = NotFound desc = could not find container \"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced\": container with ID starting with 
c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced not found: ID does not exist" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.945082 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"] Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.952056 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"] Feb 19 21:46:05 crc kubenswrapper[4795]: I0219 21:46:05.529258 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" path="/var/lib/kubelet/pods/7b4dab90-2dba-4e1f-95fe-5c435d4e270a/volumes" Feb 19 21:46:05 crc kubenswrapper[4795]: I0219 21:46:05.626040 4795 generic.go:334] "Generic (PLEG): container finished" podID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" containerID="f3e360ac25dc639935c61b2baa9937c7f07b88ebec95aede182ab25a4907955c" exitCode=0 Feb 19 21:46:05 crc kubenswrapper[4795]: I0219 21:46:05.626115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2wbff" event={"ID":"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8","Type":"ContainerDied","Data":"f3e360ac25dc639935c61b2baa9937c7f07b88ebec95aede182ab25a4907955c"} Feb 19 21:46:06 crc kubenswrapper[4795]: I0219 21:46:06.638483 4795 generic.go:334] "Generic (PLEG): container finished" podID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" containerID="088f63c71340a869532178f76fc11ab9c0cbbdc5c1aeacd6e851cf5d40a46fa0" exitCode=0 Feb 19 21:46:06 crc kubenswrapper[4795]: I0219 21:46:06.638574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxql7" event={"ID":"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23","Type":"ContainerDied","Data":"088f63c71340a869532178f76fc11ab9c0cbbdc5c1aeacd6e851cf5d40a46fa0"} Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.147523 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2wbff" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.197866 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") pod \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.197981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") pod \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.198018 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") pod \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.198109 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") pod \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.203684 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2" (OuterVolumeSpecName: "kube-api-access-6q9z2") pod "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" (UID: "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8"). InnerVolumeSpecName "kube-api-access-6q9z2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.203741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" (UID: "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.221969 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" (UID: "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.240814 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data" (OuterVolumeSpecName: "config-data") pod "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" (UID: "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.300918 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.300975 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.300984 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.300994 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.648404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2wbff" event={"ID":"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8","Type":"ContainerDied","Data":"256eb7cf5acf7b851870ab83b4723dabaee1fbaa60ee04f8ea61103c387af2ad"} Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.648814 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="256eb7cf5acf7b851870ab83b4723dabaee1fbaa60ee04f8ea61103c387af2ad" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.648437 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2wbff" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.902673 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxql7" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.912067 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf27b\" (UniqueName: \"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") pod \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.912250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") pod \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.912325 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") pod \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.927857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b" (OuterVolumeSpecName: "kube-api-access-vf27b") pod "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" (UID: "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23"). InnerVolumeSpecName "kube-api-access-vf27b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.956487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" (UID: "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.989006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data" (OuterVolumeSpecName: "config-data") pod "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" (UID: "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.014454 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.014487 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.014499 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf27b\" (UniqueName: \"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016049 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016497 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016520 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016541 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016550 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016559 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" containerName="glance-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016568 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" containerName="glance-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016582 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18561896-d336-4962-8e9e-4ccf748f8605" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016588 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="18561896-d336-4962-8e9e-4ccf748f8605" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016602 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="init" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016609 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="init" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016628 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d152069-2c3d-4cf4-94e8-3068e24def9f" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016636 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d152069-2c3d-4cf4-94e8-3068e24def9f" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016649 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" containerName="keystone-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016655 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" containerName="keystone-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016664 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57961551-d4f8-4586-b255-8810fbdb499a" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016672 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57961551-d4f8-4586-b255-8810fbdb499a" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016693 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="dnsmasq-dns" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016703 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="dnsmasq-dns" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016715 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f01f44-1467-442f-b91f-ac1765626a3d" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016723 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f01f44-1467-442f-b91f-ac1765626a3d" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016896 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="dnsmasq-dns" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016909 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f01f44-1467-442f-b91f-ac1765626a3d" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 
21:46:08.016919 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" containerName="glance-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016925 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016947 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d152069-2c3d-4cf4-94e8-3068e24def9f" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016955 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="18561896-d336-4962-8e9e-4ccf748f8605" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" containerName="keystone-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016982 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="57961551-d4f8-4586-b255-8810fbdb499a" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.017857 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.031080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.115724 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.115826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.115888 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.115920 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.116003 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.116046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217862 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217908 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.218845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.218888 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.218922 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.219082 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.219117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.238535 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.431584 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.657338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxql7" event={"ID":"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23","Type":"ContainerDied","Data":"884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4"} Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.657943 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.657733 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dxql7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.853864 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.881644 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.893222 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.904019 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.920682 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.929424 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.930563 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932109 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932157 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: 
\"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932919 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.933203 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.933379 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.933601 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.933806 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8pz2x" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.943003 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036365 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036408 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036497 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkx2w\" (UniqueName: \"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036606 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.037777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.038055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.038363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.038889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.039009 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.092897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-zjbsw"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.094077 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.096283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.107958 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.108227 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wf9zm" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.108728 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138181 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138315 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138337 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138451 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkx2w\" (UniqueName: \"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") pod 
\"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.142004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.144227 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zjbsw"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.149616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.150013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.155994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.160563 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") pod 
\"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.182904 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.197222 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.200557 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.200752 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.233640 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250673 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " 
pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250699 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250835 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250885 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250908 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.267812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkx2w\" (UniqueName: 
\"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.272111 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.272563 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.279727 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.282310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.310729 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.335869 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.339634 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.343494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.355128 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-b4bcd"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.356829 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.363965 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.364590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.376759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.376871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.376996 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.377488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc 
kubenswrapper[4795]: I0219 21:46:09.377630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.377762 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.378931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.379301 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nf9cn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.379909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.384653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.414343 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.415314 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.440564 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.441117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.450859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.498435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.516438 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jkspq"] Feb 19 21:46:09 crc kubenswrapper[4795]: 
I0219 21:46:09.517641 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.531503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.531666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.531689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.533875 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bkmsl" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.534012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.539534 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.600464 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ttz5x"] Feb 19 21:46:09 
crc kubenswrapper[4795]: I0219 21:46:09.602346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.605930 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xv49j" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.606121 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.628034 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ttz5x"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.632953 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.632994 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633126 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633149 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633205 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.641792 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.645389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.646995 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.649122 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-b4bcd"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.649337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.699231 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jkspq"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.700559 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" event={"ID":"8df3e76b-9aa2-476b-aa19-62518a8ddd1e","Type":"ContainerStarted","Data":"d98c932d8b2fc29804500d56b2954097b63f156f2f810e2111cc071a2a6acce2"} Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.720588 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.734836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.734881 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.734933 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.734954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735004 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735064 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735104 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") pod 
\"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.742376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.742544 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.743432 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.743798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.750156 
4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.750447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.751457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.753650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.758505 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.763590 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.765663 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.814776 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nf9cn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.823271 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836918 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836934 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836967 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.837044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.922755 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940403 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940437 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " 
pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940452 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.941694 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.943142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.945085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.946292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.947926 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.955332 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.974588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.985586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.989219 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.991818 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.996701 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:09.999026 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6rhfn" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.003526 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.010838 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047777 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc 
kubenswrapper[4795]: I0219 21:46:10.047805 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047836 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.048248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: W0219 21:46:10.053903 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60623dff_0241_4bc9_8b17_de61d7271e19.slice/crio-5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8 WatchSource:0}: Error finding container 5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8: Status 404 
returned error can't find the container with id 5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8 Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.086878 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.096690 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.109410 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.110755 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.125765 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.136725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.183850 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.187852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.187939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.187976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.188017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.188069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.188092 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.188181 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 
crc kubenswrapper[4795]: W0219 21:46:10.192863 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5084e7b9_4923_449e_b0d7_28c602faeff0.slice/crio-afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9 WatchSource:0}: Error finding container afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9: Status 404 returned error can't find the container with id afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.193901 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.194228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.194966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.195786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.196465 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.208719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.227648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.262054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.268911 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"]
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.290757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.290838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.290887 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.291412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.291458 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.291634 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.291668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.308070 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zjbsw"]
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.315960 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.325475 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b4bcd"]
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.393357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400294 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400369 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.401036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.401061 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.401302 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.413308 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.415835 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.420272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.422903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.447697 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.641610 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ttz5x"]
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.649330 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jkspq"]
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.664313 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"]
Feb 19 21:46:10 crc kubenswrapper[4795]: W0219 21:46:10.671255 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb8db625_527b_49de_bab0_c2065360d792.slice/crio-4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa WatchSource:0}: Error finding container 4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa: Status 404 returned error can't find the container with id 4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.715188 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xp5gn" event={"ID":"60623dff-0241-4bc9-8b17-de61d7271e19","Type":"ContainerStarted","Data":"9ae6b68ead4d625967dc9f8e67a83bcadf072cd6d6b5d617f89d10bb543678b8"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.715252 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xp5gn" event={"ID":"60623dff-0241-4bc9-8b17-de61d7271e19","Type":"ContainerStarted","Data":"5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.719013 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.719797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"333abcac5e9ec8b6d96f2784182bddc2611944e9296eb36664d925e9b90b96b2"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.721090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jkspq" event={"ID":"454af6b2-4c9e-4706-a537-b3e3d468353d","Type":"ContainerStarted","Data":"2f56b820ecab38a5f4f0b9d439b2a430292837a387794fcf0430bf0243ab99f1"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.721855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjbsw" event={"ID":"5084e7b9-4923-449e-b0d7-28c602faeff0","Type":"ContainerStarted","Data":"afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.723219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ttz5x" event={"ID":"e61f40e0-d6c3-49f7-a93f-d9956f086d4b","Type":"ContainerStarted","Data":"e8b62c3b7599823f742eaf7d07a077b7a0c11e4d7c8b7810bc63fd0eb218d413"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.725089 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b4bcd" event={"ID":"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec","Type":"ContainerStarted","Data":"b0effc166b07beb1020b33c2901a97075e3341db6cb3398e72144f393ccb6850"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.725129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b4bcd" event={"ID":"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec","Type":"ContainerStarted","Data":"9f412f3971266eb4c52987ab5a1d19a79a80d58662b3e55c9fe1df068708db16"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.728708 4795 generic.go:334] "Generic (PLEG): container finished" podID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" containerID="0896c578e55d0f5168d156a0553e2f4852050668424c462319ac2e65528da7b8" exitCode=0
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.728785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" event={"ID":"8df3e76b-9aa2-476b-aa19-62518a8ddd1e","Type":"ContainerDied","Data":"0896c578e55d0f5168d156a0553e2f4852050668424c462319ac2e65528da7b8"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.740933 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xp5gn" podStartSLOduration=2.740917479 podStartE2EDuration="2.740917479s" podCreationTimestamp="2026-02-19 21:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:10.737427801 +0000 UTC m=+1081.929945695" watchObservedRunningTime="2026-02-19 21:46:10.740917479 +0000 UTC m=+1081.933435343"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.744180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerStarted","Data":"4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.749610 4795 generic.go:334] "Generic (PLEG): container finished" podID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" containerID="729c91e32f5713fa403f44c9ceb7ddfc0360242d02f92dad958a357428c7d54a" exitCode=0
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.749653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" event={"ID":"76a3a185-4911-4b2e-ab11-5ea1c61c2b69","Type":"ContainerDied","Data":"729c91e32f5713fa403f44c9ceb7ddfc0360242d02f92dad958a357428c7d54a"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.749695 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" event={"ID":"76a3a185-4911-4b2e-ab11-5ea1c61c2b69","Type":"ContainerStarted","Data":"1deaacbccd11506cbf41e18d38101cecbd3bd4809459a1cb284d3391b2eee2e5"}
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.768292 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-b4bcd" podStartSLOduration=1.768269791 podStartE2EDuration="1.768269791s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:10.75688568 +0000 UTC m=+1081.949403544" watchObservedRunningTime="2026-02-19 21:46:10.768269791 +0000 UTC m=+1081.960787665"
Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.948312 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.301218 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7"
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.324044 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424238 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424296 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424369 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424388 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424431 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424446 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") "
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.445924 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs" (OuterVolumeSpecName: "kube-api-access-crvcs") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "kube-api-access-crvcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.464003 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.466621 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf" (OuterVolumeSpecName: "kube-api-access-bjfkf") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "kube-api-access-bjfkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: W0219 21:46:11.513644 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b668591_0c11_42f0_b813_94b76c8cbd1b.slice/crio-2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70 WatchSource:0}: Error finding container 2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70: Status 404 returned error can't find the container with id 2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.518533 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.520700 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config" (OuterVolumeSpecName: "config") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.525843 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.525871 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.525881 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.525891 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.527434 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.532080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.539081 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config" (OuterVolumeSpecName: "config") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.540089 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.541928 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.546997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.548141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.560670 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.628006 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.628545 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.628562 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.628756 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.629301 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.629460 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.629481 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.629532 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.762659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" event={"ID":"8df3e76b-9aa2-476b-aa19-62518a8ddd1e","Type":"ContainerDied","Data":"d98c932d8b2fc29804500d56b2954097b63f156f2f810e2111cc071a2a6acce2"}
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.762679 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7"
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.762717 4795 scope.go:117] "RemoveContainer" containerID="0896c578e55d0f5168d156a0553e2f4852050668424c462319ac2e65528da7b8"
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.765623 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerStarted","Data":"74e861972e0b08341b7578c99f45877ca28dee930187dfc7c5d68702065ba963"}
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.766813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerStarted","Data":"2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70"}
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.768885 4795 generic.go:334] "Generic (PLEG): container finished" podID="db8db625-527b-49de-bab0-c2065360d792" containerID="3f9a72a27cc208f60203c5a7cd9bfc613d8657082c6f35b5044badc8bc0caace" exitCode=0
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.769090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerDied","Data":"3f9a72a27cc208f60203c5a7cd9bfc613d8657082c6f35b5044badc8bc0caace"}
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.772709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" event={"ID":"76a3a185-4911-4b2e-ab11-5ea1c61c2b69","Type":"ContainerDied","Data":"1deaacbccd11506cbf41e18d38101cecbd3bd4809459a1cb284d3391b2eee2e5"}
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.772776 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.913527 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"]
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.956831 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"]
Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.987229 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"]
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:11.999477 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"]
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.007688 4795 scope.go:117] "RemoveContainer" containerID="729c91e32f5713fa403f44c9ceb7ddfc0360242d02f92dad958a357428c7d54a"
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.086462 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.124646 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.200603 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.800773 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerStarted","Data":"cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2"}
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.804606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerStarted","Data":"f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19"}
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.807066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerStarted","Data":"c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb"}
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.807389 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67dccc895-rxl4z"
Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.828219 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" podStartSLOduration=3.828205896 podStartE2EDuration="3.828205896s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:12.827585998 +0000 UTC m=+1084.020103862" watchObservedRunningTime="2026-02-19 21:46:12.828205896 +0000 UTC m=+1084.020723760"
Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.520862 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" path="/var/lib/kubelet/pods/76a3a185-4911-4b2e-ab11-5ea1c61c2b69/volumes"
Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.523550 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" path="/var/lib/kubelet/pods/8df3e76b-9aa2-476b-aa19-62518a8ddd1e/volumes"
Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.818075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerStarted","Data":"ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e"}
Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.818249 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-log" containerID="cri-o://f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19" gracePeriod=30
Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.818341 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-httpd" containerID="cri-o://ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e" gracePeriod=30
Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.825007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerStarted","Data":"a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962"}
Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.825137 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-log" containerID="cri-o://cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2" gracePeriod=30
Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.825275 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-httpd" containerID="cri-o://a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962" gracePeriod=30
Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.841607 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.841588076 podStartE2EDuration="4.841588076s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:13.835455373 +0000 UTC m=+1085.027973237" watchObservedRunningTime="2026-02-19 21:46:13.841588076 +0000 UTC m=+1085.034105940" Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.877745 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.877719575 podStartE2EDuration="5.877719575s" podCreationTimestamp="2026-02-19 21:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:13.869655738 +0000 UTC m=+1085.062173622" watchObservedRunningTime="2026-02-19 21:46:13.877719575 +0000 UTC m=+1085.070237439" Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.835368 4795 generic.go:334] "Generic (PLEG): container finished" podID="5f9a41f3-9dae-4426-8687-368f5911a834" containerID="a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962" exitCode=0 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.835638 4795 generic.go:334] "Generic (PLEG): container finished" podID="5f9a41f3-9dae-4426-8687-368f5911a834" containerID="cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2" exitCode=143 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.835400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerDied","Data":"a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962"} Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.835706 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerDied","Data":"cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2"} Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.838972 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerID="ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e" exitCode=0 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.838991 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerID="f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19" exitCode=143 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.839013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerDied","Data":"ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e"} Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.839057 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerDied","Data":"f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19"} Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.840705 4795 generic.go:334] "Generic (PLEG): container finished" podID="60623dff-0241-4bc9-8b17-de61d7271e19" containerID="9ae6b68ead4d625967dc9f8e67a83bcadf072cd6d6b5d617f89d10bb543678b8" exitCode=0 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.840732 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xp5gn" event={"ID":"60623dff-0241-4bc9-8b17-de61d7271e19","Type":"ContainerDied","Data":"9ae6b68ead4d625967dc9f8e67a83bcadf072cd6d6b5d617f89d10bb543678b8"} Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.098956 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.173224 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.173555 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-768666cd57-5c55b" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" containerID="cri-o://2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc" gracePeriod=10 Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.908245 4795 generic.go:334] "Generic (PLEG): container finished" podID="2f384824-f8ad-42d9-b09b-decb5280b448" containerID="2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc" exitCode=0 Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.908301 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerDied","Data":"2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc"} Feb 19 21:46:21 crc kubenswrapper[4795]: I0219 21:46:21.924476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xp5gn" event={"ID":"60623dff-0241-4bc9-8b17-de61d7271e19","Type":"ContainerDied","Data":"5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8"} Feb 19 21:46:21 crc kubenswrapper[4795]: I0219 21:46:21.924745 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8" Feb 19 21:46:21 crc kubenswrapper[4795]: I0219 21:46:21.966160 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.041833 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.041912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.042039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.042087 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.042132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkx2w\" (UniqueName: \"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.042159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.049547 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.049687 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.056604 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts" (OuterVolumeSpecName: "scripts") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.056661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w" (OuterVolumeSpecName: "kube-api-access-kkx2w") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "kube-api-access-kkx2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.082104 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.087347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data" (OuterVolumeSpecName: "config-data") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143838 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143869 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143877 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143885 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc 
kubenswrapper[4795]: I0219 21:46:22.143893 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143901 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkx2w\" (UniqueName: \"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.931819 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.054979 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.064147 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.150546 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:46:23 crc kubenswrapper[4795]: E0219 21:46:23.151153 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151199 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: E0219 21:46:23.151217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151225 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: E0219 
21:46:23.151284 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60623dff-0241-4bc9-8b17-de61d7271e19" containerName="keystone-bootstrap" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151296 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60623dff-0241-4bc9-8b17-de61d7271e19" containerName="keystone-bootstrap" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151695 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151746 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60623dff-0241-4bc9-8b17-de61d7271e19" containerName="keystone-bootstrap" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151767 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.155511 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.158848 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.159081 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.159079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.159512 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8pz2x" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.159660 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.180941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.262848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.262926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g59s7\" (UniqueName: \"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.262981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.263050 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.263191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.263337 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364341 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g59s7\" (UniqueName: 
\"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364412 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.370746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") pod 
\"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.370990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.371105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.371455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.371885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.381477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g59s7\" (UniqueName: \"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc 
kubenswrapper[4795]: I0219 21:46:23.477576 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.522185 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60623dff-0241-4bc9-8b17-de61d7271e19" path="/var/lib/kubelet/pods/60623dff-0241-4bc9-8b17-de61d7271e19/volumes" Feb 19 21:46:28 crc kubenswrapper[4795]: I0219 21:46:28.428126 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:46:28 crc kubenswrapper[4795]: I0219 21:46:28.428931 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.038755 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-5c55b" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.767794 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.782273 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.870922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.870984 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871009 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871047 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871067 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871120 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871219 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871330 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871357 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.872878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.873063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs" (OuterVolumeSpecName: "logs") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.873246 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.873402 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs" (OuterVolumeSpecName: "logs") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.878458 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc" (OuterVolumeSpecName: "kube-api-access-v2rcc") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "kube-api-access-v2rcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.878937 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts" (OuterVolumeSpecName: "scripts") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.879042 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.879534 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w" (OuterVolumeSpecName: "kube-api-access-r9x6w") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "kube-api-access-r9x6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.881486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.892484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts" (OuterVolumeSpecName: "scripts") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.898805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.900639 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.923266 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data" (OuterVolumeSpecName: "config-data") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.925748 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data" (OuterVolumeSpecName: "config-data") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973831 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973867 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973922 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973933 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973944 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973952 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973987 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973994 4795 reconciler_common.go:293] "Volume detached for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974003 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974018 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974026 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974034 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974043 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974054 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.986067 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerDied","Data":"74e861972e0b08341b7578c99f45877ca28dee930187dfc7c5d68702065ba963"} Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.986377 4795 scope.go:117] "RemoveContainer" containerID="a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.986079 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.992461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerDied","Data":"2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70"} Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.992581 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.000326 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.002796 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.020473 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.027050 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.075450 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.075488 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089288 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: E0219 21:46:30.089693 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089710 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: E0219 21:46:30.089722 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089728 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: E0219 21:46:30.089742 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089748 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: E0219 21:46:30.089757 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089763 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089936 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089949 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089967 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089976 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.090827 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.093851 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.097642 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.097694 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.099309 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.105904 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6rhfn" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.112937 4795 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.125319 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.141750 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.143336 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.149774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.149951 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.157367 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180476 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180804 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180852 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284721 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284831 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284882 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284925 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.286372 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.286435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.286709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.290805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.292752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.293648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.300887 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.317437 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.338055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386811 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") pod 
\"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.387089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.387115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.387492 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.388190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.388421 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.392293 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.393064 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.393200 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.393919 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc 
kubenswrapper[4795]: I0219 21:46:30.404932 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.413772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.434469 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.472635 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:31 crc kubenswrapper[4795]: E0219 21:46:31.194364 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 19 21:46:31 crc kubenswrapper[4795]: E0219 21:46:31.194544 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Volum
eMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbqsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-zjbsw_openstack(5084e7b9-4923-449e-b0d7-28c602faeff0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:46:31 crc kubenswrapper[4795]: E0219 21:46:31.195878 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-zjbsw" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.204924 4795 scope.go:117] "RemoveContainer" containerID="cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.352030 4795 scope.go:117] "RemoveContainer" containerID="ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.436505 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.455929 4795 scope.go:117] "RemoveContainer" containerID="f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.525388 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" path="/var/lib/kubelet/pods/2b668591-0c11-42f0-b813-94b76c8cbd1b/volumes" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.526734 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" path="/var/lib/kubelet/pods/5f9a41f3-9dae-4426-8687-368f5911a834/volumes" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.547870 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548372 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548397 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548759 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.554277 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm" (OuterVolumeSpecName: "kube-api-access-w95wm") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "kube-api-access-w95wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.594341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config" (OuterVolumeSpecName: "config") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.594921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.595689 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.599543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.605775 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650586 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650621 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650630 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650640 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650648 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650657 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.695630 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:46:31 crc kubenswrapper[4795]: W0219 21:46:31.696694 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2a7b298_40b6_43b3_9099_ec74f2f0bfad.slice/crio-476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad WatchSource:0}: Error finding container 476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad: Status 404 returned error can't find the container with id 476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.866159 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.009903 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ttz5x" event={"ID":"e61f40e0-d6c3-49f7-a93f-d9956f086d4b","Type":"ContainerStarted","Data":"76c63f1533e7c3fa5624057c7772474f1729c45e568e970f6a98d3e337ab74ae"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.011671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.013650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jkspq" event={"ID":"454af6b2-4c9e-4706-a537-b3e3d468353d","Type":"ContainerStarted","Data":"a55b1033c9d990cf35d1c92321ef2ffff86d5c602754349685623befa9e70257"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.015356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nffrq" event={"ID":"a2a7b298-40b6-43b3-9099-ec74f2f0bfad","Type":"ContainerStarted","Data":"da621ffa210adf5451b6eb9d78a9cbed2608c7b5e2b1a9ce00eba1c1e65c6ec4"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.015380 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nffrq" 
event={"ID":"a2a7b298-40b6-43b3-9099-ec74f2f0bfad","Type":"ContainerStarted","Data":"476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.018421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerDied","Data":"13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.018450 4795 scope.go:117] "RemoveContainer" containerID="2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.018538 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.024896 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ttz5x" podStartSLOduration=2.465339131 podStartE2EDuration="23.024878702s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="2026-02-19 21:46:10.648954286 +0000 UTC m=+1081.841472150" lastFinishedPulling="2026-02-19 21:46:31.208493857 +0000 UTC m=+1102.401011721" observedRunningTime="2026-02-19 21:46:32.024693486 +0000 UTC m=+1103.217211350" watchObservedRunningTime="2026-02-19 21:46:32.024878702 +0000 UTC m=+1103.217396566" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.038769 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerStarted","Data":"fe3c35dfb7e24d88f06d64c3416cafe5c8ebe7a75022634f77450664da8f2158"} Feb 19 21:46:32 crc kubenswrapper[4795]: E0219 21:46:32.045136 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-zjbsw" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.053032 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jkspq" podStartSLOduration=2.538526204 podStartE2EDuration="23.053013725s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="2026-02-19 21:46:10.651319762 +0000 UTC m=+1081.843837636" lastFinishedPulling="2026-02-19 21:46:31.165807293 +0000 UTC m=+1102.358325157" observedRunningTime="2026-02-19 21:46:32.042019035 +0000 UTC m=+1103.234536899" watchObservedRunningTime="2026-02-19 21:46:32.053013725 +0000 UTC m=+1103.245531589" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.067280 4795 scope.go:117] "RemoveContainer" containerID="269c23977ee5c8c89a33c0c6142172272e66947879bd10de823a2ea6ed071bb6" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.067326 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nffrq" podStartSLOduration=9.067304598 podStartE2EDuration="9.067304598s" podCreationTimestamp="2026-02-19 21:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:32.058948142 +0000 UTC m=+1103.251466016" watchObservedRunningTime="2026-02-19 21:46:32.067304598 +0000 UTC m=+1103.259822482" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.098986 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.105537 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.435446 4795 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:32 crc kubenswrapper[4795]: W0219 21:46:32.622977 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac1caac2_edf5_453d_a76d_e1c65b7f038b.slice/crio-3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753 WatchSource:0}: Error finding container 3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753: Status 404 returned error can't find the container with id 3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753 Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.052618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd"} Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.053998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerStarted","Data":"3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753"} Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.055904 4795 generic.go:334] "Generic (PLEG): container finished" podID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" containerID="b0effc166b07beb1020b33c2901a97075e3341db6cb3398e72144f393ccb6850" exitCode=0 Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.055939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b4bcd" event={"ID":"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec","Type":"ContainerDied","Data":"b0effc166b07beb1020b33c2901a97075e3341db6cb3398e72144f393ccb6850"} Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.060367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerStarted","Data":"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a"} Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.527131 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" path="/var/lib/kubelet/pods/2f384824-f8ad-42d9-b09b-decb5280b448/volumes" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.039301 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-5c55b" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.071517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerStarted","Data":"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d"} Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.075535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerStarted","Data":"32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec"} Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.075588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerStarted","Data":"7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981"} Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.077574 4795 generic.go:334] "Generic (PLEG): container finished" podID="454af6b2-4c9e-4706-a537-b3e3d468353d" containerID="a55b1033c9d990cf35d1c92321ef2ffff86d5c602754349685623befa9e70257" exitCode=0 Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.077622 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jkspq" event={"ID":"454af6b2-4c9e-4706-a537-b3e3d468353d","Type":"ContainerDied","Data":"a55b1033c9d990cf35d1c92321ef2ffff86d5c602754349685623befa9e70257"} Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.106923 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.10690428 podStartE2EDuration="4.10690428s" podCreationTimestamp="2026-02-19 21:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:34.094620163 +0000 UTC m=+1105.287138057" watchObservedRunningTime="2026-02-19 21:46:34.10690428 +0000 UTC m=+1105.299422144" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.128772 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.128747196 podStartE2EDuration="4.128747196s" podCreationTimestamp="2026-02-19 21:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:34.122360496 +0000 UTC m=+1105.314878380" watchObservedRunningTime="2026-02-19 21:46:34.128747196 +0000 UTC m=+1105.321265060" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.495098 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.608132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") pod \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.608206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") pod \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.608237 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") pod \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.620453 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk" (OuterVolumeSpecName: "kube-api-access-c6xqk") pod "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" (UID: "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec"). InnerVolumeSpecName "kube-api-access-c6xqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.638095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config" (OuterVolumeSpecName: "config") pod "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" (UID: "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.644410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" (UID: "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.710895 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.710951 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.710974 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.089463 4795 generic.go:334] "Generic (PLEG): container finished" podID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" containerID="76c63f1533e7c3fa5624057c7772474f1729c45e568e970f6a98d3e337ab74ae" exitCode=0 Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.089517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ttz5x" event={"ID":"e61f40e0-d6c3-49f7-a93f-d9956f086d4b","Type":"ContainerDied","Data":"76c63f1533e7c3fa5624057c7772474f1729c45e568e970f6a98d3e337ab74ae"} Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.091254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-b4bcd" event={"ID":"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec","Type":"ContainerDied","Data":"9f412f3971266eb4c52987ab5a1d19a79a80d58662b3e55c9fe1df068708db16"} Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.091280 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f412f3971266eb4c52987ab5a1d19a79a80d58662b3e55c9fe1df068708db16" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.091372 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.104287 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" containerID="da621ffa210adf5451b6eb9d78a9cbed2608c7b5e2b1a9ce00eba1c1e65c6ec4" exitCode=0 Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.104409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nffrq" event={"ID":"a2a7b298-40b6-43b3-9099-ec74f2f0bfad","Type":"ContainerDied","Data":"da621ffa210adf5451b6eb9d78a9cbed2608c7b5e2b1a9ce00eba1c1e65c6ec4"} Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318120 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:35 crc kubenswrapper[4795]: E0219 21:46:35.318516 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="init" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318530 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="init" Feb 19 21:46:35 crc kubenswrapper[4795]: E0219 21:46:35.318546 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318553 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" Feb 19 21:46:35 crc kubenswrapper[4795]: E0219 21:46:35.318562 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" containerName="neutron-db-sync" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318568 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" containerName="neutron-db-sync" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318703 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318728 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" containerName="neutron-db-sync" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.319619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.332985 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.424900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.424979 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " 
pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.425020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.425038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rcql\" (UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.425109 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.425231 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.478383 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.480606 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.483565 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.483669 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.483755 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.483855 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nf9cn" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.525022 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526323 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: 
\"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526354 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526398 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526423 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " 
pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526469 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rcql\" (UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.527662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.528985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: 
I0219 21:46:35.529503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.529971 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.531456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.549951 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rcql\" (UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.618829 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628584 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc 
kubenswrapper[4795]: I0219 21:46:35.634726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.641032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.642942 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.643419 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.647623 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.660930 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730241 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730363 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730476 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730529 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.732233 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs" (OuterVolumeSpecName: "logs") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.737408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts" (OuterVolumeSpecName: "scripts") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.737718 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb" (OuterVolumeSpecName: "kube-api-access-ppjmb") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "kube-api-access-ppjmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.788378 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.793474 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data" (OuterVolumeSpecName: "config-data") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.812511 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833657 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833690 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833700 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833712 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833722 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.121046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jkspq" event={"ID":"454af6b2-4c9e-4706-a537-b3e3d468353d","Type":"ContainerDied","Data":"2f56b820ecab38a5f4f0b9d439b2a430292837a387794fcf0430bf0243ab99f1"} Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.121512 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f56b820ecab38a5f4f0b9d439b2a430292837a387794fcf0430bf0243ab99f1" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.121397 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.148893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:36 crc kubenswrapper[4795]: W0219 21:46:36.161440 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b5d1bab_ed1a_4a5e_a194_ad6b25d16210.slice/crio-d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70 WatchSource:0}: Error finding container d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70: Status 404 returned error can't find the container with id d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70 Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.283932 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"] Feb 19 21:46:36 crc kubenswrapper[4795]: E0219 21:46:36.284318 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454af6b2-4c9e-4706-a537-b3e3d468353d" containerName="placement-db-sync" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.284335 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="454af6b2-4c9e-4706-a537-b3e3d468353d" containerName="placement-db-sync" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.284488 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="454af6b2-4c9e-4706-a537-b3e3d468353d" containerName="placement-db-sync" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.285343 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.287295 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.287739 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.287831 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.288006 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bkmsl" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.290111 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.303580 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"] Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.355829 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.355919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.355944 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.355999 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.356425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.356696 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.356811 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.445608 4795 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.466700 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.466814 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.466921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.467434 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.467976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 
21:46:36.468017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.468055 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.468193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.475937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.475969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.476355 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.479125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.479523 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: W0219 21:46:36.481513 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af40d0f_93fe_4592_a07b_0cee3eefbde5.slice/crio-0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f WatchSource:0}: Error finding container 0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f: Status 404 returned error can't find the container with id 0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.487625 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.617832 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.816991 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.820259 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") pod \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875831 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875878 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g59s7\" (UniqueName: \"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875919 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") pod \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875987 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.876026 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") pod \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.876095 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.888997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" 
(UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.890114 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts" (OuterVolumeSpecName: "scripts") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.890548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e61f40e0-d6c3-49f7-a93f-d9956f086d4b" (UID: "e61f40e0-d6c3-49f7-a93f-d9956f086d4b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.890908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7" (OuterVolumeSpecName: "kube-api-access-g59s7") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "kube-api-access-g59s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.892641 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch" (OuterVolumeSpecName: "kube-api-access-k7sch") pod "e61f40e0-d6c3-49f7-a93f-d9956f086d4b" (UID: "e61f40e0-d6c3-49f7-a93f-d9956f086d4b"). InnerVolumeSpecName "kube-api-access-k7sch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.899522 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.933759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data" (OuterVolumeSpecName: "config-data") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.942235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.954858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e61f40e0-d6c3-49f7-a93f-d9956f086d4b" (UID: "e61f40e0-d6c3-49f7-a93f-d9956f086d4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979636 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979684 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979695 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979705 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979713 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979721 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979730 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979737 
4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979745 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g59s7\" (UniqueName: \"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.097253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"] Feb 19 21:46:37 crc kubenswrapper[4795]: W0219 21:46:37.106332 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd5855e1_cadb_4170_8339_5f10945c6ce9.slice/crio-fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46 WatchSource:0}: Error finding container fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46: Status 404 returned error can't find the container with id fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46 Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.135613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nffrq" event={"ID":"a2a7b298-40b6-43b3-9099-ec74f2f0bfad","Type":"ContainerDied","Data":"476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.135668 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.135680 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.137921 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerID="c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185" exitCode=0 Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.137984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerDied","Data":"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.138009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerStarted","Data":"d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.144777 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerStarted","Data":"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.144833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerStarted","Data":"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.144848 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerStarted","Data":"0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.145001 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.146441 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerStarted","Data":"fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.175704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ttz5x" event={"ID":"e61f40e0-d6c3-49f7-a93f-d9956f086d4b","Type":"ContainerDied","Data":"e8b62c3b7599823f742eaf7d07a077b7a0c11e4d7c8b7810bc63fd0eb218d413"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.175739 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8b62c3b7599823f742eaf7d07a077b7a0c11e4d7c8b7810bc63fd0eb218d413" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.175800 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.192912 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78d7c97684-8rgnf" podStartSLOduration=2.192895103 podStartE2EDuration="2.192895103s" podCreationTimestamp="2026-02-19 21:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:37.190768123 +0000 UTC m=+1108.383286007" watchObservedRunningTime="2026-02-19 21:46:37.192895103 +0000 UTC m=+1108.385412967" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.235779 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:46:37 crc kubenswrapper[4795]: E0219 21:46:37.236072 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" containerName="keystone-bootstrap" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236089 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" containerName="keystone-bootstrap" Feb 19 21:46:37 crc kubenswrapper[4795]: E0219 21:46:37.236116 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" containerName="barbican-db-sync" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236124 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" containerName="barbican-db-sync" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236289 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" containerName="barbican-db-sync" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236302 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" containerName="keystone-bootstrap" 
Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236817 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.250154 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8pz2x" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.250439 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.253506 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.253653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.253940 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.256418 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.276133 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.403995 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404469 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.429275 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.431014 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.455183 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.458148 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.469472 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.472643 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xv49j" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.472783 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.472881 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.473074 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.476429 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506094 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: 
I0219 21:46:37.506121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506251 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506267 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506307 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506332 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506367 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.552029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.552683 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.552994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.553410 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.557433 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.557848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.567554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.611447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.611506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.611533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") pod 
\"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.617654 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633399 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796613 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.797283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc 
kubenswrapper[4795]: I0219 21:46:37.839231 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.895581 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.897137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.903483 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.904966 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.906530 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.910300 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.914931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.915609 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 
21:46:37.916413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.916752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.919046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.919133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.921257 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.924776 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.926719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.929922 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.930838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.932439 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.934133 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.941344 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.947264 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.955312 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.999363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000365 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000435 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.019142 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.034405 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.034550 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.076285 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.082298 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103380 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") pod 
\"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103436 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103456 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103543 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103585 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103842 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103986 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " 
pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104243 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " 
pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.105294 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: 
I0219 21:46:38.105534 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.119014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.137843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.187609 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 
crc kubenswrapper[4795]: I0219 21:46:38.210923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.211022 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.211054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212346 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 
21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212857 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212963 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" 
Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212996 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " 
pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213950 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.214315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.215235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerStarted","Data":"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec"} Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.220300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.223551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.224323 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.225641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.226647 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.228326 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.229895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.230215 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.236636 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.238263 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.238442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.239267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.245787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.246103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerStarted","Data":"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3"} Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.246560 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.247667 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="dnsmasq-dns" containerID="cri-o://67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" gracePeriod=10 Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.251028 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.252905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.273229 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.301939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" podStartSLOduration=3.30190427 podStartE2EDuration="3.30190427s" podCreationTimestamp="2026-02-19 21:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:38.26998949 +0000 UTC m=+1109.462507354" watchObservedRunningTime="2026-02-19 21:46:38.30190427 +0000 UTC m=+1109.494422134" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.425368 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.434245 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.457104 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.485206 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.514763 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.653695 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.746768 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.935777 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:46:38 crc kubenswrapper[4795]: W0219 21:46:38.955195 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b928260_ac65_479d_bd4b_f14b48d24ddb.slice/crio-bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077 WatchSource:0}: Error finding container bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077: Status 404 returned error can't find the container with id bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077 Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.018082 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.157929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.157972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.158017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.158077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.158134 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.158231 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rcql\" 
(UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.165329 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql" (OuterVolumeSpecName: "kube-api-access-9rcql") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "kube-api-access-9rcql". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.210829 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.231904 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:39 crc kubenswrapper[4795]: W0219 21:46:39.246076 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ae0e35c_6ee4_4e25_a76d_7033c2a3f09b.slice/crio-e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db WatchSource:0}: Error finding container e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db: Status 404 returned error can't find the container with id e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.303498 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rcql\" (UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.316344 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.329856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config" (OuterVolumeSpecName: "config") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.354447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerStarted","Data":"03d4ba2c4c2e12794fc2992e51f7d78cc75a2714702ce6582a4cf8391b4968e8"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.366540 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.368871 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.370594 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerStarted","Data":"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.370823 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.370852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.373042 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerStarted","Data":"e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.375754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.384633 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.398011 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerStarted","Data":"4e28f3e465acd4c6f672fce5b1b114fcf10939feb9816a3820f5b6f281f46c1c"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407142 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407189 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407198 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407207 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407216 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.410392 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411245 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerID="67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" exitCode=0 Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411353 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerDied","Data":"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerDied","Data":"d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411416 4795 scope.go:117] "RemoveContainer" containerID="67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.420428 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76bfdcc9c4-d56mx" podStartSLOduration=3.420403535 podStartE2EDuration="3.420403535s" podCreationTimestamp="2026-02-19 21:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:39.391466859 +0000 UTC m=+1110.583984733" watchObservedRunningTime="2026-02-19 21:46:39.420403535 +0000 UTC m=+1110.612921399" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.422887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6945f64f65-rnq2b" 
event={"ID":"4b928260-ac65-479d-bd4b-f14b48d24ddb","Type":"ContainerStarted","Data":"bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.448156 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.478341 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.487983 4795 scope.go:117] "RemoveContainer" containerID="c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.492855 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.535457 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" path="/var/lib/kubelet/pods/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210/volumes" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.536031 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:46:39 crc kubenswrapper[4795]: E0219 21:46:39.536443 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="dnsmasq-dns" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.536459 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="dnsmasq-dns" Feb 19 21:46:39 crc kubenswrapper[4795]: E0219 21:46:39.536512 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="init" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.536521 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" 
containerName="init" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.536786 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="dnsmasq-dns" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.537769 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.551042 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.552508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.551136 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.570347 4795 scope.go:117] "RemoveContainer" containerID="67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" Feb 19 21:46:39 crc kubenswrapper[4795]: E0219 21:46:39.571365 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3\": container with ID starting with 67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3 not found: ID does not exist" containerID="67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.571470 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3"} err="failed to get container status \"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3\": rpc error: code = NotFound desc = could not find container 
\"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3\": container with ID starting with 67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3 not found: ID does not exist" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.571558 4795 scope.go:117] "RemoveContainer" containerID="c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185" Feb 19 21:46:39 crc kubenswrapper[4795]: E0219 21:46:39.575383 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185\": container with ID starting with c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185 not found: ID does not exist" containerID="c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.575428 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185"} err="failed to get container status \"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185\": rpc error: code = NotFound desc = could not find container \"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185\": container with ID starting with c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185 not found: ID does not exist" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.655092 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.674650 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.675961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714376 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.816213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817877 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818135 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84vz2\" 
(UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818224 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818350 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818437 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.838733 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.838794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.838844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.839480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: 
\"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.841846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.848027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.872910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" 
Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920700 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920748 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84vz2\" (UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.921298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.921857 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.926901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.932465 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.933502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.940954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.945851 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.947485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84vz2\" (UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.021698 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.058143 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.435081 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.435140 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.454844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerStarted","Data":"d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.455354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerStarted","Data":"be89fa2e2a317aa99f43093ff5681ba4ce22ead52fd8adae9138024217c6894f"} Feb 19 21:46:40 crc 
kubenswrapper[4795]: I0219 21:46:40.458120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerStarted","Data":"c0b18fd78cd092c133f6dd779fd8c2b41870a6c99e45b8bcd625ff594cb4d9de"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.460013 4795 generic.go:334] "Generic (PLEG): container finished" podID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerID="81fbd576f753564d8cc96fbaca875356e50537cd709002f5a9b77dc1f35e7279" exitCode=0 Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.460094 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerDied","Data":"81fbd576f753564d8cc96fbaca875356e50537cd709002f5a9b77dc1f35e7279"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.461237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerStarted","Data":"26ff50c7b1851e9704bfa4221d66176820b3417a16cff032c2c82bc2945df7a8"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.472875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.473807 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.473857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6945f64f65-rnq2b" event={"ID":"4b928260-ac65-479d-bd4b-f14b48d24ddb","Type":"ContainerStarted","Data":"77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.473894 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.476212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerStarted","Data":"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.476256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerStarted","Data":"86ab35cfd9c80c07dee3068dbe14e2a6dadda5cbd9e0d550648a5054e671ff43"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.488460 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.510243 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.519705 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6945f64f65-rnq2b" podStartSLOduration=3.519685517 podStartE2EDuration="3.519685517s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:40.505105146 +0000 UTC m=+1111.697623020" watchObservedRunningTime="2026-02-19 21:46:40.519685517 +0000 UTC m=+1111.712203381" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.534956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.563506 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:46:41 
crc kubenswrapper[4795]: I0219 21:46:41.486431 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:46:41 crc kubenswrapper[4795]: I0219 21:46:41.486471 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:46:41 crc kubenswrapper[4795]: I0219 21:46:41.486496 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:41 crc kubenswrapper[4795]: I0219 21:46:41.486780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.176306 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.204922 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.206309 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.208312 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.212877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.234186 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.366800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367108 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367546 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469264 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469371 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.470422 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.476369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.477056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.479305 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.482466 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") pod 
\"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.483539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.490843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.525157 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.506579 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.507074 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.653086 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.660694 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.660825 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.737129 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.016873 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.033823 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.123936 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.299678 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.598919 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" 
event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerStarted","Data":"0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.600673 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerStarted","Data":"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.602424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.602443 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607471 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerStarted","Data":"bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607693 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" containerID="cri-o://d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569" gracePeriod=30 Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607804 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" containerID="cri-o://bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e" gracePeriod=30 Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607898 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607913 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.617364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerStarted","Data":"95eca73f9943de18ca7dd19f1ef5d95e39ab42d81563dce332afbfa7377d20f4"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.619944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerStarted","Data":"16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.619981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerStarted","Data":"47caa9a7519cc7b778b03d7e938c02973816e703da61178a9af0e7d1bdc77812"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.639704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.656253 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fdbf9784-tjjsd" podStartSLOduration=7.656226029 podStartE2EDuration="7.656226029s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:44.637029677 +0000 UTC m=+1115.829547541" watchObservedRunningTime="2026-02-19 
21:46:44.656226029 +0000 UTC m=+1115.848743893" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.661615 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5f5874b546-54bp8" podStartSLOduration=7.66159949 podStartE2EDuration="7.66159949s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:44.657193726 +0000 UTC m=+1115.849711590" watchObservedRunningTime="2026-02-19 21:46:44.66159949 +0000 UTC m=+1115.854117354" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.667955 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerStarted","Data":"af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.668877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.704829 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": EOF" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.709265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerStarted","Data":"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.710492 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" podStartSLOduration=7.710475729 podStartE2EDuration="7.710475729s" 
podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:44.70912437 +0000 UTC m=+1115.901642254" watchObservedRunningTime="2026-02-19 21:46:44.710475729 +0000 UTC m=+1115.902993593" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.718800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerStarted","Data":"cc411b717439dc2d51f309775cfcf3728048016bc68869b8b28221a90840d6fb"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.747784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerStarted","Data":"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.750062 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerStarted","Data":"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.762876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerStarted","Data":"9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.764533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.764561 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:45 crc kubenswrapper[4795]: 
I0219 21:46:45.780276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerStarted","Data":"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.780340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerStarted","Data":"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.780390 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.796643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerStarted","Data":"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.800157 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerID="d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569" exitCode=143 Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.800221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerDied","Data":"d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.802100 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerStarted","Data":"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075"} Feb 19 21:46:45 crc 
kubenswrapper[4795]: I0219 21:46:45.803445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerStarted","Data":"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.803540 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerStarted","Data":"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.803619 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.803680 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.806212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerStarted","Data":"f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.815518 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6df95dfbd4-ftf6x" podStartSLOduration=6.815499723 podStartE2EDuration="6.815499723s" podCreationTimestamp="2026-02-19 21:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:45.79303627 +0000 UTC m=+1116.985554144" watchObservedRunningTime="2026-02-19 21:46:45.815499723 +0000 UTC m=+1117.008017587" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.819208 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-576c65f985-r97z7" podStartSLOduration=6.819198288 podStartE2EDuration="6.819198288s" podCreationTimestamp="2026-02-19 21:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:45.811618654 +0000 UTC m=+1117.004136518" watchObservedRunningTime="2026-02-19 21:46:45.819198288 +0000 UTC m=+1117.011716152" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.830025 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-54956bdb55-m77pq" podStartSLOduration=4.027117709 podStartE2EDuration="8.830010173s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="2026-02-19 21:46:38.697210499 +0000 UTC m=+1109.889728363" lastFinishedPulling="2026-02-19 21:46:43.500102963 +0000 UTC m=+1114.692620827" observedRunningTime="2026-02-19 21:46:45.826535355 +0000 UTC m=+1117.019053219" watchObservedRunningTime="2026-02-19 21:46:45.830010173 +0000 UTC m=+1117.022528037" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.830438 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerStarted","Data":"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.845916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjbsw" event={"ID":"5084e7b9-4923-449e-b0d7-28c602faeff0","Type":"ContainerStarted","Data":"0852bee2926e9dad886249433c004e0f4241817093b65fb2480708d4c6c502c0"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.885579 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" podStartSLOduration=4.749627204 podStartE2EDuration="8.885560869s" podCreationTimestamp="2026-02-19 
21:46:37 +0000 UTC" firstStartedPulling="2026-02-19 21:46:39.446507331 +0000 UTC m=+1110.639025195" lastFinishedPulling="2026-02-19 21:46:43.582440996 +0000 UTC m=+1114.774958860" observedRunningTime="2026-02-19 21:46:45.846064695 +0000 UTC m=+1117.038582549" watchObservedRunningTime="2026-02-19 21:46:45.885560869 +0000 UTC m=+1117.078078733" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.894911 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7cd95cf589-2gw48" podStartSLOduration=4.751367894 podStartE2EDuration="8.894893313s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="2026-02-19 21:46:39.424741827 +0000 UTC m=+1110.617259691" lastFinishedPulling="2026-02-19 21:46:43.568267246 +0000 UTC m=+1114.760785110" observedRunningTime="2026-02-19 21:46:45.865066751 +0000 UTC m=+1117.057584605" watchObservedRunningTime="2026-02-19 21:46:45.894893313 +0000 UTC m=+1117.087411177" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.924186 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.930028 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f98bf9994-pr48x" podStartSLOduration=3.930000943 podStartE2EDuration="3.930000943s" podCreationTimestamp="2026-02-19 21:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:45.892255948 +0000 UTC m=+1117.084773812" watchObservedRunningTime="2026-02-19 21:46:45.930000943 +0000 UTC m=+1117.122518817" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.974788 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.986024 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-db-sync-zjbsw" podStartSLOduration=3.622193946 podStartE2EDuration="36.986003302s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="2026-02-19 21:46:10.219688239 +0000 UTC m=+1081.412206103" lastFinishedPulling="2026-02-19 21:46:43.583497595 +0000 UTC m=+1114.776015459" observedRunningTime="2026-02-19 21:46:45.921925185 +0000 UTC m=+1117.114443049" watchObservedRunningTime="2026-02-19 21:46:45.986003302 +0000 UTC m=+1117.178521166" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.993604 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" podStartSLOduration=4.3497769680000005 podStartE2EDuration="8.993591346s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="2026-02-19 21:46:38.828525432 +0000 UTC m=+1110.021043296" lastFinishedPulling="2026-02-19 21:46:43.47233981 +0000 UTC m=+1114.664857674" observedRunningTime="2026-02-19 21:46:45.948568786 +0000 UTC m=+1117.141086650" watchObservedRunningTime="2026-02-19 21:46:45.993591346 +0000 UTC m=+1117.186109210" Feb 19 21:46:46 crc kubenswrapper[4795]: I0219 21:46:46.854705 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:46:46 crc kubenswrapper[4795]: I0219 21:46:46.921507 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:47 crc kubenswrapper[4795]: I0219 21:46:47.862266 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-54956bdb55-m77pq" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker-log" containerID="cri-o://6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" gracePeriod=30 Feb 19 21:46:47 crc kubenswrapper[4795]: I0219 21:46:47.862595 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-worker-54956bdb55-m77pq" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker" containerID="cri-o://8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" gracePeriod=30 Feb 19 21:46:47 crc kubenswrapper[4795]: I0219 21:46:47.863111 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener-log" containerID="cri-o://c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" gracePeriod=30 Feb 19 21:46:47 crc kubenswrapper[4795]: I0219 21:46:47.863220 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener" containerID="cri-o://5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" gracePeriod=30 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.622912 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.715092 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.776980 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777028 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777053 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777082 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777100 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777120 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777188 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777215 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777258 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.779027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs" (OuterVolumeSpecName: "logs") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.779221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs" (OuterVolumeSpecName: "logs") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.785523 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.786580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s" (OuterVolumeSpecName: "kube-api-access-slh8s") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "kube-api-access-slh8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.790889 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.792377 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762" (OuterVolumeSpecName: "kube-api-access-cp762") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "kube-api-access-cp762". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.822725 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.822855 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.840925 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data" (OuterVolumeSpecName: "config-data") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.863852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data" (OuterVolumeSpecName: "config-data") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.872966 4795 generic.go:334] "Generic (PLEG): container finished" podID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" exitCode=0 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873009 4795 generic.go:334] "Generic (PLEG): container finished" podID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" exitCode=143 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873051 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerDied","Data":"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerDied","Data":"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerDied","Data":"4e28f3e465acd4c6f672fce5b1b114fcf10939feb9816a3820f5b6f281f46c1c"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873179 4795 scope.go:117] "RemoveContainer" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.878984 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879005 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879015 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 
19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879023 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879032 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879041 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879052 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879061 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879071 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879080 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880701 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="62013c47-67bc-44c5-a250-390102661c05" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" exitCode=0 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880732 4795 generic.go:334] "Generic (PLEG): container finished" podID="62013c47-67bc-44c5-a250-390102661c05" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" exitCode=143 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerDied","Data":"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880773 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerDied","Data":"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerDied","Data":"03d4ba2c4c2e12794fc2992e51f7d78cc75a2714702ce6582a4cf8391b4968e8"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.907097 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.918075 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.922996 4795 scope.go:117] 
"RemoveContainer" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.926059 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.935093 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.956947 4795 scope.go:117] "RemoveContainer" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" Feb 19 21:46:48 crc kubenswrapper[4795]: E0219 21:46:48.957391 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": container with ID starting with 8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9 not found: ID does not exist" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.957436 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9"} err="failed to get container status \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": rpc error: code = NotFound desc = could not find container \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": container with ID starting with 8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9 not found: ID does not exist" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.957465 4795 scope.go:117] "RemoveContainer" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" Feb 19 21:46:48 crc kubenswrapper[4795]: E0219 21:46:48.958065 4795 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": container with ID starting with 6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab not found: ID does not exist" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.958116 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab"} err="failed to get container status \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": rpc error: code = NotFound desc = could not find container \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": container with ID starting with 6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab not found: ID does not exist" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.958144 4795 scope.go:117] "RemoveContainer" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.958589 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9"} err="failed to get container status \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": rpc error: code = NotFound desc = could not find container \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": container with ID starting with 8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9 not found: ID does not exist" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.958631 4795 scope.go:117] "RemoveContainer" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.959057 4795 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab"} err="failed to get container status \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": rpc error: code = NotFound desc = could not find container \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": container with ID starting with 6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab not found: ID does not exist" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.959102 4795 scope.go:117] "RemoveContainer" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.980337 4795 scope.go:117] "RemoveContainer" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.044889 4795 scope.go:117] "RemoveContainer" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" Feb 19 21:46:49 crc kubenswrapper[4795]: E0219 21:46:49.045350 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": container with ID starting with 5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075 not found: ID does not exist" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.045379 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075"} err="failed to get container status \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": rpc error: code = NotFound desc = could not find container \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": container with ID starting with 
5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075 not found: ID does not exist" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.045399 4795 scope.go:117] "RemoveContainer" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" Feb 19 21:46:49 crc kubenswrapper[4795]: E0219 21:46:49.045737 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": container with ID starting with c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b not found: ID does not exist" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.045766 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b"} err="failed to get container status \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": rpc error: code = NotFound desc = could not find container \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": container with ID starting with c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b not found: ID does not exist" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.045779 4795 scope.go:117] "RemoveContainer" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.046058 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075"} err="failed to get container status \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": rpc error: code = NotFound desc = could not find container \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": container with ID 
starting with 5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075 not found: ID does not exist" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.046089 4795 scope.go:117] "RemoveContainer" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.046252 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b"} err="failed to get container status \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": rpc error: code = NotFound desc = could not find container \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": container with ID starting with c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b not found: ID does not exist" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.525261 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" path="/var/lib/kubelet/pods/159cf586-d51b-49e8-bea8-99af238b8a3e/volumes" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.526097 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62013c47-67bc-44c5-a250-390102661c05" path="/var/lib/kubelet/pods/62013c47-67bc-44c5-a250-390102661c05/volumes" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.915385 4795 generic.go:334] "Generic (PLEG): container finished" podID="5084e7b9-4923-449e-b0d7-28c602faeff0" containerID="0852bee2926e9dad886249433c004e0f4241817093b65fb2480708d4c6c502c0" exitCode=0 Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.915479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjbsw" event={"ID":"5084e7b9-4923-449e-b0d7-28c602faeff0","Type":"ContainerDied","Data":"0852bee2926e9dad886249433c004e0f4241817093b65fb2480708d4c6c502c0"} Feb 19 21:46:51 crc kubenswrapper[4795]: I0219 
21:46:51.112437 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:37872->10.217.0.158:9311: read: connection reset by peer" Feb 19 21:46:51 crc kubenswrapper[4795]: I0219 21:46:51.941272 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerID="bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e" exitCode=0 Feb 19 21:46:51 crc kubenswrapper[4795]: I0219 21:46:51.941340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerDied","Data":"bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e"} Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.427468 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.435520 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": dial tcp 10.217.0.158:9311: connect: connection refused" Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.435763 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": dial tcp 10.217.0.158:9311: connect: connection refused" Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.593588 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.594134 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="dnsmasq-dns" containerID="cri-o://c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb" gracePeriod=10 Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.962411 4795 generic.go:334] "Generic (PLEG): container finished" podID="db8db625-527b-49de-bab0-c2065360d792" containerID="c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb" exitCode=0 Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.962470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerDied","Data":"c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb"} Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.005043 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.038500 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.087612 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.087860 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log" containerID="cri-o://628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.088245 4795 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" containerID="cri-o://1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.105978 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.106025 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.301299 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357097 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357361 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357480 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357570 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.360206 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.361188 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.365082 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts" (OuterVolumeSpecName: "scripts") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.381352 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc" (OuterVolumeSpecName: "kube-api-access-qbqsc") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "kube-api-access-qbqsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.429029 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data" (OuterVolumeSpecName: "config-data") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.440301 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.460632 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462312 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462336 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462350 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462639 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462661 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.529764 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.563763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") pod \"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.564133 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") pod \"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.564162 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") pod \"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.564283 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") pod \"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.564423 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") pod 
\"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.571696 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.573459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx" (OuterVolumeSpecName: "kube-api-access-xgvdx") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "kube-api-access-xgvdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.574321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs" (OuterVolumeSpecName: "logs") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.596639 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.647374 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.647583 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data" (OuterVolumeSpecName: "config-data") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668074 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668145 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668196 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668294 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668344 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669094 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669112 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669121 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669129 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669138 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc 
kubenswrapper[4795]: I0219 21:46:54.682501 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b" (OuterVolumeSpecName: "kube-api-access-jfv2b") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "kube-api-access-jfv2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.735072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.738762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.757679 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config" (OuterVolumeSpecName: "config") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.758615 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.769984 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.770027 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.770037 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.770047 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.770056 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.778079 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.871317 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.972978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf"} Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973110 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-central-agent" containerID="cri-o://b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973239 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973251 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="proxy-httpd" containerID="cri-o://99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973311 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="sg-core" 
containerID="cri-o://f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973364 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-notification-agent" containerID="cri-o://e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.983943 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerID="628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451" exitCode=143 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.984033 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerDied","Data":"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451"} Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.990111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerDied","Data":"be89fa2e2a317aa99f43093ff5681ba4ce22ead52fd8adae9138024217c6894f"} Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.990156 4795 scope.go:117] "RemoveContainer" containerID="bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.990304 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.003133 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.850288343 podStartE2EDuration="46.003120068s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="2026-02-19 21:46:10.187077849 +0000 UTC m=+1081.379595713" lastFinishedPulling="2026-02-19 21:46:54.339909574 +0000 UTC m=+1125.532427438" observedRunningTime="2026-02-19 21:46:55.001498892 +0000 UTC m=+1126.194016756" watchObservedRunningTime="2026-02-19 21:46:55.003120068 +0000 UTC m=+1126.195637932" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.025754 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerDied","Data":"4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa"} Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.025879 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.032928 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.033016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjbsw" event={"ID":"5084e7b9-4923-449e-b0d7-28c602faeff0","Type":"ContainerDied","Data":"afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9"} Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.033046 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.050464 4795 scope.go:117] "RemoveContainer" containerID="d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.077247 4795 scope.go:117] "RemoveContainer" containerID="c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.096831 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.112415 4795 scope.go:117] "RemoveContainer" containerID="3f9a72a27cc208f60203c5a7cd9bfc613d8657082c6f35b5044badc8bc0caace" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.112670 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.120634 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.128156 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.520729 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" 
path="/var/lib/kubelet/pods/c1ee7c17-521f-45b3-bdb4-748939838e60/volumes" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.521318 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8db625-527b-49de-bab0-c2065360d792" path="/var/lib/kubelet/pods/db8db625-527b-49de-bab0-c2065360d792/volumes" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.606801 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607152 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607195 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607210 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607217 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener-log" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607227 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" containerName="cinder-db-sync" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607233 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" containerName="cinder-db-sync" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607244 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607250 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607259 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607265 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607276 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="init" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607281 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="init" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607289 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607295 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker-log" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607301 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="dnsmasq-dns" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607309 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="dnsmasq-dns" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607323 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607329 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607496 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607512 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607520 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="dnsmasq-dns" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607528 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607544 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" containerName="cinder-db-sync" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607561 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607574 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607585 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.614202 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.619446 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.619609 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.619630 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.622076 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wf9zm" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.625045 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.662996 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.664370 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.688001 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700021 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700464 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700511 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803092 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803206 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803265 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.804298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " 
pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.804352 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.804967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.805322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.805328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.805698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.811395 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.813363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.822690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.830761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.840856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.845027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") pod 
\"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.907215 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.908636 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.913331 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.921410 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.932546 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.988128 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016181 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016283 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.079892 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6698443-b029-4098-81d6-dba6d5f239f2" containerID="99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" exitCode=0 Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080242 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6698443-b029-4098-81d6-dba6d5f239f2" containerID="f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" exitCode=2 Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080252 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6698443-b029-4098-81d6-dba6d5f239f2" containerID="b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" exitCode=0 Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf"} Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62"} Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81"} Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119240 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119346 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119375 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119400 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.123980 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc 
kubenswrapper[4795]: I0219 21:46:56.124400 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.124974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.125318 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.140574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.252080 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.584766 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:46:56 crc kubenswrapper[4795]: W0219 21:46:56.587227 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2384b48a_ae68_4495_9c68_2faf894de9f9.slice/crio-1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c WatchSource:0}: Error finding container 1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c: Status 404 returned error can't find the container with id 1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.700856 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.833042 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:46:56 crc kubenswrapper[4795]: W0219 21:46:56.859314 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddff4a455_15cc_4733_adfd_0f27404e54ed.slice/crio-34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601 WatchSource:0}: Error finding container 34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601: Status 404 returned error can't find the container with id 34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601 Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.940576 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038651 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038895 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.040477 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.041672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.044946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts" (OuterVolumeSpecName: "scripts") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.047849 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg" (OuterVolumeSpecName: "kube-api-access-drpsg") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "kube-api-access-drpsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.065100 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.093068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerStarted","Data":"34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.094524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerStarted","Data":"1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098652 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6698443-b029-4098-81d6-dba6d5f239f2" containerID="e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" exitCode=0 Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098765 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"333abcac5e9ec8b6d96f2784182bddc2611944e9296eb36664d925e9b90b96b2"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098805 4795 scope.go:117] "RemoveContainer" containerID="99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.113739 4795 generic.go:334] "Generic (PLEG): container finished" podID="df387754-5537-4d85-950b-02743c881da8" containerID="93a750f0a2156e5774cb54d38a230aaaa13bdcb55ce20e481877fcd04f00ee2e" exitCode=0 Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.113786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerDied","Data":"93a750f0a2156e5774cb54d38a230aaaa13bdcb55ce20e481877fcd04f00ee2e"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.113834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerStarted","Data":"8495c4433e798d1ac2e79ddf9c88ebf569c3cedf6bdc46579d7bb36aaf2eff72"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.117079 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: 
"f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.118356 4795 scope.go:117] "RemoveContainer" containerID="f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.138857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data" (OuterVolumeSpecName: "config-data") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141507 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141528 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141540 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141549 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141556 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141565 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141573 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.153457 4795 scope.go:117] "RemoveContainer" containerID="e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.331274 4795 scope.go:117] "RemoveContainer" containerID="b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.360574 4795 scope.go:117] "RemoveContainer" containerID="99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.361543 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf\": container with ID starting with 99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf not found: ID does not exist" containerID="99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.361586 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf"} err="failed to get container status \"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf\": rpc 
error: code = NotFound desc = could not find container \"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf\": container with ID starting with 99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf not found: ID does not exist" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.361611 4795 scope.go:117] "RemoveContainer" containerID="f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.361889 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62\": container with ID starting with f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62 not found: ID does not exist" containerID="f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.361932 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62"} err="failed to get container status \"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62\": rpc error: code = NotFound desc = could not find container \"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62\": container with ID starting with f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62 not found: ID does not exist" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.361961 4795 scope.go:117] "RemoveContainer" containerID="e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.362225 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd\": container with ID starting with 
e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd not found: ID does not exist" containerID="e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.362248 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd"} err="failed to get container status \"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd\": rpc error: code = NotFound desc = could not find container \"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd\": container with ID starting with e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd not found: ID does not exist" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.362263 4795 scope.go:117] "RemoveContainer" containerID="b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.362582 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81\": container with ID starting with b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81 not found: ID does not exist" containerID="b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.362609 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81"} err="failed to get container status \"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81\": rpc error: code = NotFound desc = could not find container \"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81\": container with ID starting with b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81 not found: ID does not 
exist" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.568370 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.579759 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625141 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.625590 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-central-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625606 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-central-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.625627 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-notification-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625634 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-notification-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.625663 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="sg-core" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625669 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="sg-core" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.625681 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="proxy-httpd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625687 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="proxy-httpd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625860 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-notification-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625881 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="sg-core" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625897 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="proxy-httpd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625909 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-central-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.627417 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.630520 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.630913 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652038 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59wzg\" (UniqueName: \"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " 
pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-59wzg\" (UniqueName: \"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754519 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.755046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc 
kubenswrapper[4795]: I0219 21:46:57.755289 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0"
Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.762935 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0"
Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.769894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0"
Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.770301 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0"
Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.778932 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0"
Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.784692 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59wzg\" (UniqueName: \"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0"
Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.951102 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.123292 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerStarted","Data":"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"}
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.129805 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerStarted","Data":"70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9"}
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.129943 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.167889 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" podStartSLOduration=3.167872322 podStartE2EDuration="3.167872322s" podCreationTimestamp="2026-02-19 21:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:58.153339542 +0000 UTC m=+1129.345857406" watchObservedRunningTime="2026-02-19 21:46:58.167872322 +0000 UTC m=+1129.360390186"
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.278013 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.427178 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.427515 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.472394 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:46:58 crc kubenswrapper[4795]: W0219 21:46:58.504570 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2b418ec_23ae_4edd_8e61_0522a69c6be4.slice/crio-aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11 WatchSource:0}: Error finding container aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11: Status 404 returned error can't find the container with id aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.599972 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:39470->10.217.0.161:9311: read: connection reset by peer"
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.600020 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:37784->10.217.0.161:9311: read: connection reset by peer"
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.600331 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": dial tcp 10.217.0.161:9311: connect: connection refused"
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.934926 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fdbf9784-tjjsd"
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.991825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") "
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.991963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") "
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.992052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") "
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.992233 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") "
Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.992275 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") "
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:58.992681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs" (OuterVolumeSpecName: "logs") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.000226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj" (OuterVolumeSpecName: "kube-api-access-4krhj") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "kube-api-access-4krhj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.004766 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.046496 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.072531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data" (OuterVolumeSpecName: "config-data") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094707 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094743 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094757 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094768 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094780 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141445 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerID="1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721" exitCode=0
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerDied","Data":"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721"}
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fdbf9784-tjjsd"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141555 4795 scope.go:117] "RemoveContainer" containerID="1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141542 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerDied","Data":"86ab35cfd9c80c07dee3068dbe14e2a6dadda5cbd9e0d550648a5054e671ff43"}
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.149087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerStarted","Data":"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"}
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.149285 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api-log" containerID="cri-o://de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14" gracePeriod=30
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.149346 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api" containerID="cri-o://be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237" gracePeriod=30
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.149436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.157553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6"}
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.157605 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11"}
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.159576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerStarted","Data":"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c"}
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.159627 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerStarted","Data":"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad"}
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.173328 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.173311988 podStartE2EDuration="4.173311988s" podCreationTimestamp="2026-02-19 21:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:59.169730297 +0000 UTC m=+1130.362248161" watchObservedRunningTime="2026-02-19 21:46:59.173311988 +0000 UTC m=+1130.365829852"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.180862 4795 scope.go:117] "RemoveContainer" containerID="628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.207879 4795 scope.go:117] "RemoveContainer" containerID="1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.211315 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.416279198 podStartE2EDuration="4.211297259s" podCreationTimestamp="2026-02-19 21:46:55 +0000 UTC" firstStartedPulling="2026-02-19 21:46:56.590459146 +0000 UTC m=+1127.782977010" lastFinishedPulling="2026-02-19 21:46:57.385477207 +0000 UTC m=+1128.577995071" observedRunningTime="2026-02-19 21:46:59.190295457 +0000 UTC m=+1130.382813331" watchObservedRunningTime="2026-02-19 21:46:59.211297259 +0000 UTC m=+1130.403815123"
Feb 19 21:46:59 crc kubenswrapper[4795]: E0219 21:46:59.222785 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721\": container with ID starting with 1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721 not found: ID does not exist" containerID="1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.222887 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721"} err="failed to get container status \"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721\": rpc error: code = NotFound desc = could not find container \"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721\": container with ID starting with 1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721 not found: ID does not exist"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.222929 4795 scope.go:117] "RemoveContainer" containerID="628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451"
Feb 19 21:46:59 crc kubenswrapper[4795]: E0219 21:46:59.223601 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451\": container with ID starting with 628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451 not found: ID does not exist" containerID="628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.223654 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451"} err="failed to get container status \"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451\": rpc error: code = NotFound desc = could not find container \"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451\": container with ID starting with 628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451 not found: ID does not exist"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.237110 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"]
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.248222 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"]
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.542007 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" path="/var/lib/kubelet/pods/f6698443-b029-4098-81d6-dba6d5f239f2/volumes"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.543415 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" path="/var/lib/kubelet/pods/ff9dd6ee-d043-41af-bcfa-8385ae786038/volumes"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.750039 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.808619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") "
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.808850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") "
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.808997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") "
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.809103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") "
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.809224 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") "
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.809599 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") "
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.809719 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") "
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.811915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.812663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs" (OuterVolumeSpecName: "logs") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.814212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.816330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts" (OuterVolumeSpecName: "scripts") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.818387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd" (OuterVolumeSpecName: "kube-api-access-r8fxd") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "kube-api-access-r8fxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.848349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.885988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data" (OuterVolumeSpecName: "config-data") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.911995 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912028 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912037 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912046 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912056 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912065 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912073 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169679 4795 generic.go:334] "Generic (PLEG): container finished" podID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237" exitCode=0
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169708 4795 generic.go:334] "Generic (PLEG): container finished" podID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14" exitCode=143
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169753 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerDied","Data":"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"}
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerDied","Data":"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"}
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169805 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerDied","Data":"34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601"}
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169820 4795 scope.go:117] "RemoveContainer" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.172417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86"}
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.203070 4795 scope.go:117] "RemoveContainer" containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.214354 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.224275 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.236723 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.237034 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api-log"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237050 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api-log"
Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.237073 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237079 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api"
Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.237107 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237113 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api"
Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.237123 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237130 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237282 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237314 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237324 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api-log"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.238104 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.251798 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.252054 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.253115 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.278743 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.319949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.319997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320645 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.402389 4795 scope.go:117] "RemoveContainer" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"
Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.403597 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": container with ID starting with be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237 not found: ID does not exist" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.403650 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"} err="failed to get container status \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": rpc error: code = NotFound desc = could not find container \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": container with ID starting with be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237 not found: ID does not exist"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.403680 4795 scope.go:117] "RemoveContainer" containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"
Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.407465 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": container with ID starting with de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14 not found: ID does not exist" containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.407499 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"} err="failed to get container status \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": rpc error: code = NotFound desc = could not find container \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": container with ID starting with de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14 not found: ID does not exist"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.407520 4795 scope.go:117] "RemoveContainer" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.407875 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"} err="failed to get container status \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": rpc error: code = NotFound desc = could not find container \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": container with ID starting with be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237 not found: ID does not exist"
Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.407929 4795 scope.go:117]
"RemoveContainer" containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.408531 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"} err="failed to get container status \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": rpc error: code = NotFound desc = could not find container \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": container with ID starting with de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14 not found: ID does not exist" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422486 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422547 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422582 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422687 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.423283 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.423352 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.432610 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.432786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.439428 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.439658 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 
crc kubenswrapper[4795]: I0219 21:47:00.451833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.458259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.464895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.690765 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.934040 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 21:47:01 crc kubenswrapper[4795]: I0219 21:47:01.157479 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:47:01 crc kubenswrapper[4795]: I0219 21:47:01.180515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerStarted","Data":"72a722e4ebd20de4e2ab880d4812af758e115c7dc2dbe4b6fadf7ad0adda880d"} Feb 19 21:47:01 crc kubenswrapper[4795]: I0219 21:47:01.185644 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a"} Feb 19 21:47:01 crc kubenswrapper[4795]: I0219 21:47:01.523578 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" path="/var/lib/kubelet/pods/dff4a455-15cc-4733-adfd-0f27404e54ed/volumes" Feb 19 21:47:02 crc kubenswrapper[4795]: I0219 21:47:02.196630 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c"} Feb 19 21:47:02 crc kubenswrapper[4795]: I0219 21:47:02.196954 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:47:02 crc kubenswrapper[4795]: I0219 21:47:02.201394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerStarted","Data":"46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399"} Feb 19 21:47:02 
crc kubenswrapper[4795]: I0219 21:47:02.225703 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8638595100000002 podStartE2EDuration="5.225685472s" podCreationTimestamp="2026-02-19 21:46:57 +0000 UTC" firstStartedPulling="2026-02-19 21:46:58.516183685 +0000 UTC m=+1129.708701549" lastFinishedPulling="2026-02-19 21:47:01.878009637 +0000 UTC m=+1133.070527511" observedRunningTime="2026-02-19 21:47:02.220939269 +0000 UTC m=+1133.413457123" watchObservedRunningTime="2026-02-19 21:47:02.225685472 +0000 UTC m=+1133.418203336" Feb 19 21:47:03 crc kubenswrapper[4795]: I0219 21:47:03.211774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerStarted","Data":"067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3"} Feb 19 21:47:03 crc kubenswrapper[4795]: I0219 21:47:03.231974 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.231957312 podStartE2EDuration="3.231957312s" podCreationTimestamp="2026-02-19 21:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:03.227915988 +0000 UTC m=+1134.420433852" watchObservedRunningTime="2026-02-19 21:47:03.231957312 +0000 UTC m=+1134.424475166" Feb 19 21:47:04 crc kubenswrapper[4795]: I0219 21:47:04.220559 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 21:47:05 crc kubenswrapper[4795]: I0219 21:47:05.823973 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:47:05 crc kubenswrapper[4795]: I0219 21:47:05.990441 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:47:06 crc 
kubenswrapper[4795]: I0219 21:47:06.057290 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"] Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.057569 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="dnsmasq-dns" containerID="cri-o://af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c" gracePeriod=10 Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.139135 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.206160 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.245413 4795 generic.go:334] "Generic (PLEG): container finished" podID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerID="af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c" exitCode=0 Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.245487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerDied","Data":"af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c"} Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.245617 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="cinder-scheduler" containerID="cri-o://749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad" gracePeriod=30 Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.245656 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="probe" 
containerID="cri-o://9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c" gracePeriod=30 Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.561550 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655723 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655759 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655789 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") pod 
\"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.656016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.677045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w" (OuterVolumeSpecName: "kube-api-access-49w8w") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "kube-api-access-49w8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.710932 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config" (OuterVolumeSpecName: "config") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.719350 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.731145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.743646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.748624 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758473 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758531 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758542 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758552 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758561 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758569 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.256528 4795 generic.go:334] "Generic (PLEG): container finished" podID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerID="9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c" exitCode=0 Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.256607 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerDied","Data":"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c"} Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.258572 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerDied","Data":"e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db"} Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.258614 4795 scope.go:117] "RemoveContainer" containerID="af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c" Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.258637 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.287090 4795 scope.go:117] "RemoveContainer" containerID="81fbd576f753564d8cc96fbaca875356e50537cd709002f5a9b77dc1f35e7279" Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.290029 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"] Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.308782 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"] Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.525855 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" path="/var/lib/kubelet/pods/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b/volumes" Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.693213 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.733867 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:47:09 crc kubenswrapper[4795]: I0219 21:47:09.625872 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:47:09 crc kubenswrapper[4795]: I0219 21:47:09.925607 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014684 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014761 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014950 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014970 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.015415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.019959 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts" (OuterVolumeSpecName: "scripts") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.020023 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6" (OuterVolumeSpecName: "kube-api-access-5clv6") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "kube-api-access-5clv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.020364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.046593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.121769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.124901 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.125532 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.125546 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.125565 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.125574 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:10 crc kubenswrapper[4795]: W0219 21:47:10.126046 4795 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2384b48a-ae68-4495-9c68-2faf894de9f9/volumes/kubernetes.io~secret/combined-ca-bundle Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.126056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.127155 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.127362 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78d7c97684-8rgnf" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-api" containerID="cri-o://45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393" gracePeriod=30 Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.127821 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78d7c97684-8rgnf" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-httpd" containerID="cri-o://2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" gracePeriod=30 Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.187971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data" (OuterVolumeSpecName: "config-data") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.226557 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.226583 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.293999 4795 generic.go:334] "Generic (PLEG): container finished" podID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerID="749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad" exitCode=0 Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.294075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerDied","Data":"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad"} Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.294112 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerDied","Data":"1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c"} Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.294135 4795 scope.go:117] "RemoveContainer" containerID="9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.294330 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.307137 4795 generic.go:334] "Generic (PLEG): container finished" podID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerID="2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" exitCode=0 Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.307203 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerDied","Data":"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9"} Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.340665 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.361578 4795 scope.go:117] "RemoveContainer" containerID="749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.366580 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396404 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.396752 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="dnsmasq-dns" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396769 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="dnsmasq-dns" Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.396785 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="probe" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396792 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="probe" Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.396811 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="init" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396817 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="init" Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.396829 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="cinder-scheduler" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396835 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="cinder-scheduler" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396990 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="cinder-scheduler" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.397006 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="probe" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.397020 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="dnsmasq-dns" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.397866 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.405375 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.409292 4795 scope.go:117] "RemoveContainer" containerID="9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.409987 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.414138 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c\": container with ID starting with 9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c not found: ID does not exist" containerID="9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.414251 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c"} err="failed to get container status \"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c\": rpc error: code = NotFound desc = could not find container \"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c\": container with ID starting with 9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c not found: ID does not exist" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.414273 4795 scope.go:117] "RemoveContainer" containerID="749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad" Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.418379 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad\": container with ID starting with 749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad not found: ID does not exist" containerID="749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.418437 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad"} err="failed to get container status \"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad\": rpc error: code = NotFound desc = could not find container \"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad\": container with ID starting with 749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad not found: ID does not exist" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " 
pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439318 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439366 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439407 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543368 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 
21:47:10.543527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.544272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.549486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.551099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.551895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.552689 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.565660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.728583 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:47:11 crc kubenswrapper[4795]: W0219 21:47:11.350597 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54f77a4_1095_4ff1_bc74_b845cde659d9.slice/crio-edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540 WatchSource:0}: Error finding container edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540: Status 404 returned error can't find the container with id edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540 Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.354040 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.357438 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.381096 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.450453 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"] Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.454357 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76bfdcc9c4-d56mx" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-log" containerID="cri-o://07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec" gracePeriod=30 Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.454534 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76bfdcc9c4-d56mx" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-api" containerID="cri-o://79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8" gracePeriod=30 Feb 
19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.531416 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" path="/var/lib/kubelet/pods/2384b48a-ae68-4495-9c68-2faf894de9f9/volumes" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.116092 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.185940 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.186287 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.186350 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.186400 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.186590 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.197349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.198281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp" (OuterVolumeSpecName: "kube-api-access-kp5fp") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "kube-api-access-kp5fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.240359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.240912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config" (OuterVolumeSpecName: "config") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.272443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288377 4795 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288410 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288424 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288432 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288442 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326682 4795 generic.go:334] "Generic (PLEG): container finished" podID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" 
containerID="45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393" exitCode=0 Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerDied","Data":"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326771 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerDied","Data":"0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326766 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326786 4795 scope.go:117] "RemoveContainer" containerID="2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.328976 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerID="07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec" exitCode=143 Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.329032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerDied","Data":"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.333421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerStarted","Data":"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 
21:47:12.333471 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerStarted","Data":"edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.359902 4795 scope.go:117] "RemoveContainer" containerID="45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.361707 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.373393 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.387641 4795 scope.go:117] "RemoveContainer" containerID="2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" Feb 19 21:47:12 crc kubenswrapper[4795]: E0219 21:47:12.388052 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9\": container with ID starting with 2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9 not found: ID does not exist" containerID="2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.388082 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9"} err="failed to get container status \"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9\": rpc error: code = NotFound desc = could not find container \"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9\": container with ID starting with 2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9 not found: ID does not exist" Feb 
19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.388102 4795 scope.go:117] "RemoveContainer" containerID="45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393"
Feb 19 21:47:12 crc kubenswrapper[4795]: E0219 21:47:12.388715 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393\": container with ID starting with 45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393 not found: ID does not exist" containerID="45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393"
Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.388736 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393"} err="failed to get container status \"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393\": rpc error: code = NotFound desc = could not find container \"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393\": container with ID starting with 45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393 not found: ID does not exist"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.183757 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.345753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerStarted","Data":"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1"}
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.367752 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.367734398 podStartE2EDuration="3.367734398s" podCreationTimestamp="2026-02-19 21:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:13.367238394 +0000 UTC m=+1144.559756268" watchObservedRunningTime="2026-02-19 21:47:13.367734398 +0000 UTC m=+1144.560252262"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.524679 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" path="/var/lib/kubelet/pods/2af40d0f-93fe-4592-a07b-0cee3eefbde5/volumes"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.570856 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 19 21:47:13 crc kubenswrapper[4795]: E0219 21:47:13.571244 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-httpd"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.571266 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-httpd"
Feb 19 21:47:13 crc kubenswrapper[4795]: E0219 21:47:13.571293 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-api"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.571302 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-api"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.571490 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-httpd"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.571508 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-api"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.572088 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.573793 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.573804 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-r9sqp"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.575778 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.581246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.613878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.613923 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.614011 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.614144 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.722635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.722777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.722832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.722858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.743286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.754434 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.761251 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.789599 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient"
Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.893634 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 21:47:14 crc kubenswrapper[4795]: I0219 21:47:14.395453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.074157 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76bfdcc9c4-d56mx"
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151244 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") "
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151290 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") "
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") "
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") "
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151494 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") "
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") "
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151630 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") "
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.152915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs" (OuterVolumeSpecName: "logs") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.165268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts" (OuterVolumeSpecName: "scripts") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.193717 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57" (OuterVolumeSpecName: "kube-api-access-fdw57") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "kube-api-access-fdw57". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.221730 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.227214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data" (OuterVolumeSpecName: "config-data") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253428 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253481 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253493 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253501 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253510 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.272738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.314326 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.355303 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.355342 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.361978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"336beec4-e534-448f-8367-78645b53650e","Type":"ContainerStarted","Data":"9f40a6e1a339f74b374579f38441616a24d07c91d67da5f54b0e5c6df69736a0"}
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.364075 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerID="79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8" exitCode=0
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.364112 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerDied","Data":"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8"}
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.364137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerDied","Data":"fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46"}
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.364153 4795 scope.go:117] "RemoveContainer" containerID="79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8"
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.364208 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76bfdcc9c4-d56mx"
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.397964 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"]
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.406100 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"]
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.413296 4795 scope.go:117] "RemoveContainer" containerID="07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec"
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.458382 4795 scope.go:117] "RemoveContainer" containerID="79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8"
Feb 19 21:47:15 crc kubenswrapper[4795]: E0219 21:47:15.458855 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8\": container with ID starting with 79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8 not found: ID does not exist" containerID="79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8"
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.458897 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8"} err="failed to get container status \"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8\": rpc error: code = NotFound desc = could not find container \"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8\": container with ID starting with 79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8 not found: ID does not exist"
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.458925 4795 scope.go:117] "RemoveContainer" containerID="07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec"
Feb 19 21:47:15 crc kubenswrapper[4795]: E0219 21:47:15.459389 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec\": container with ID starting with 07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec not found: ID does not exist" containerID="07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec"
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.459431 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec"} err="failed to get container status \"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec\": rpc error: code = NotFound desc = could not find container \"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec\": container with ID starting with 07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec not found: ID does not exist"
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.521879 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" path="/var/lib/kubelet/pods/bd5855e1-cadb-4170-8339-5f10945c6ce9/volumes"
Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.729492 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.031813 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"]
Feb 19 21:47:16 crc kubenswrapper[4795]: E0219 21:47:16.032156 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-log"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.032186 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-log"
Feb 19 21:47:16 crc kubenswrapper[4795]: E0219 21:47:16.032223 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-api"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.032229 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-api"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.032401 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-log"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.032426 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-api"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.038398 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.041132 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.041233 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.044029 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.048821 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"]
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172380 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wfxq\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172669 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274706 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274906 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274961 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274986 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wfxq\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.275027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.275484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.275514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.279585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.288841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.288894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.289321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.289374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.292012 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wfxq\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.353676 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.057688 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"]
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.091711 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.091949 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-central-agent" containerID="cri-o://f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6" gracePeriod=30
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.095569 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="sg-core" containerID="cri-o://a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a" gracePeriod=30
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.095647 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" containerID="cri-o://d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c" gracePeriod=30
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.095709 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-notification-agent" containerID="cri-o://fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86" gracePeriod=30
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.239303 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.421834 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerID="a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a" exitCode=2
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.421912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a"}
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.426569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerStarted","Data":"38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66"}
Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.426606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerStarted","Data":"fd10d4f85e04ded895f7718dd53443f09a3be089bf6f4718e6d017852d997436"}
Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.711356 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerID="d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c" exitCode=0
Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.711608 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerID="f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6" exitCode=0
Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.711648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c"}
Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.711674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6"}
Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.718132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerStarted","Data":"812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2"}
Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.719390 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.719415 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-858c4dcd57-whkj2"
Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.744740 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-858c4dcd57-whkj2" podStartSLOduration=2.744714763 podStartE2EDuration="2.744714763s" podCreationTimestamp="2026-02-19 21:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:18.734153835 +0000 UTC m=+1149.926671699" watchObservedRunningTime="2026-02-19 21:47:18.744714763 +0000 UTC m=+1149.937232627"
Feb 19 21:47:20 crc kubenswrapper[4795]: I0219 21:47:20.750718 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerID="fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86" exitCode=0
Feb 19 21:47:20 crc kubenswrapper[4795]: I0219 21:47:20.750788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86"}
Feb 19 21:47:21 crc kubenswrapper[4795]: I0219 21:47:21.039750 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.011833 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-r8v4f"]
Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.013469 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r8v4f"
Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.036624 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r8v4f"]
Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.099805 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-h72xz"]
Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.100990 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h72xz"
Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.109777 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"]
Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.110968 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.114478 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.127348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.127394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.132934 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-h72xz"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.145023 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.228871 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.228961 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqs8n\" 
(UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.229302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.229702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.229731 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.229758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.230420 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.254002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.307029 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.308460 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331704 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqs8n\" (UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.332531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.332612 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.339244 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.339584 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.360500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.361871 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.363636 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.370646 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.374305 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.375329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqs8n\" (UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.379150 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.391720 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.423211 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.433632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.433765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.436245 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.442153 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.467150 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.517773 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.518969 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.523656 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.534126 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.537450 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.537552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.631645 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.639321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.639410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.639524 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.639542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.640325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.672291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.734189 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.734412 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" containerID="cri-o://7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981" gracePeriod=30 Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.734582 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" containerID="cri-o://32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec" gracePeriod=30 Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.741971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 
21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.742496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.743369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.768636 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.850676 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.860139 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.110468 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.268284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.268470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.268639 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.268944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.269059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.269101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59wzg\" (UniqueName: 
\"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.269125 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.270132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.272279 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.280847 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg" (OuterVolumeSpecName: "kube-api-access-59wzg") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "kube-api-access-59wzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.286408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts" (OuterVolumeSpecName: "scripts") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.328105 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r8v4f"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.331055 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: W0219 21:47:27.342902 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod282595b2_0eaa_4deb_9af4_288241817325.slice/crio-f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88 WatchSource:0}: Error finding container f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88: Status 404 returned error can't find the container with id f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88 Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.370991 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.371099 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.371154 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59wzg\" (UniqueName: \"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.372016 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.372048 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.397933 4795 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-h72xz"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.404617 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.425396 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data" (OuterVolumeSpecName: "config-data") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.434134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.458415 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.480031 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.480059 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.624649 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.649958 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:47:27 crc kubenswrapper[4795]: W0219 21:47:27.699916 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6952c796_d85e_49b3_b931_60966311a0c0.slice/crio-0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276 WatchSource:0}: Error finding container 0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276: Status 404 returned error can't find the container with id 0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276 Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.819647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p7s8n" event={"ID":"3778f66e-fd7f-4af5-ae3e-2a7c272785a0","Type":"ContainerStarted","Data":"68df0b7ecf93ce028557646109c36262df3e9ce4a929fd966aed1e0fe5f5daff"} Feb 19 21:47:27 crc kubenswrapper[4795]: 
I0219 21:47:27.820699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"336beec4-e534-448f-8367-78645b53650e","Type":"ContainerStarted","Data":"6f5fd7fb0db869022abdd3586a7debaf90502df56c7437b86541c8afbbd3687a"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.825909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.825936 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.825963 4795 scope.go:117] "RemoveContainer" containerID="d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.842857 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.490945388 podStartE2EDuration="14.842838503s" podCreationTimestamp="2026-02-19 21:47:13 +0000 UTC" firstStartedPulling="2026-02-19 21:47:14.406319799 +0000 UTC m=+1145.598837663" lastFinishedPulling="2026-02-19 21:47:26.758212914 +0000 UTC m=+1157.950730778" observedRunningTime="2026-02-19 21:47:27.842260237 +0000 UTC m=+1159.034778101" watchObservedRunningTime="2026-02-19 21:47:27.842838503 +0000 UTC m=+1159.035356367" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.869215 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.878628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r8v4f" event={"ID":"573a7aa5-43d9-4523-8eea-4c1a36da49fb","Type":"ContainerStarted","Data":"b1129ce7375e075c1c1844910d66cfc1b6308074f1d6f73dea4c3d974a9f4054"} Feb 19 21:47:27 
crc kubenswrapper[4795]: I0219 21:47:27.885243 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.899981 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:27 crc kubenswrapper[4795]: E0219 21:47:27.900453 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900487 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" Feb 19 21:47:27 crc kubenswrapper[4795]: E0219 21:47:27.900515 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-central-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900521 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-central-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: E0219 21:47:27.900551 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-notification-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900558 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-notification-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: E0219 21:47:27.900568 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="sg-core" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900573 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="sg-core" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900796 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="sg-core" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900817 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900834 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-central-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900842 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-notification-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.902947 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.905432 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.905528 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerID="7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981" exitCode=143 Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.905659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerDied","Data":"7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.905713 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.908690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" 
event={"ID":"6952c796-d85e-49b3-b931-60966311a0c0","Type":"ContainerStarted","Data":"0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.913106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.930009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h72xz" event={"ID":"29d85454-a8db-47bc-b616-bbdb4f6d8920","Type":"ContainerStarted","Data":"af2bb649161fdcaac08f1bb77ef6e41cc4143b0e27292cc1c3c0624b994df5bf"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.940854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-jrx2c" event={"ID":"282595b2-0eaa-4deb-9af4-288241817325","Type":"ContainerStarted","Data":"f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.942195 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" event={"ID":"1946f4fd-5254-4e66-8739-5a51af23e963","Type":"ContainerStarted","Data":"1c20c0048ca6ea52ec70d130bc06285f1a2ec973f63fcbb08a3874e51e60bfaa"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.949000 4795 scope.go:117] "RemoveContainer" containerID="a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.976264 4795 scope.go:117] "RemoveContainer" containerID="fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.996849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 
21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.996887 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.996998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.997024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.997042 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.997057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.997070 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.010318 4795 scope.go:117] "RemoveContainer" containerID="f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098431 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098459 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098535 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.099053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.099447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.105020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc 
kubenswrapper[4795]: I0219 21:47:28.106590 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.106862 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.109029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.122107 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.231191 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.427282 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.430770 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.430827 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.442128 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.442209 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593" gracePeriod=600 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.753508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:28 
crc kubenswrapper[4795]: I0219 21:47:28.952202 4795 generic.go:334] "Generic (PLEG): container finished" podID="6952c796-d85e-49b3-b931-60966311a0c0" containerID="df968256162833d5078e440bc65555f6f0195c60776c40f5962f3cbd9e8c0552" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.952287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" event={"ID":"6952c796-d85e-49b3-b931-60966311a0c0","Type":"ContainerDied","Data":"df968256162833d5078e440bc65555f6f0195c60776c40f5962f3cbd9e8c0552"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.954055 4795 generic.go:334] "Generic (PLEG): container finished" podID="29d85454-a8db-47bc-b616-bbdb4f6d8920" containerID="a0f7c3f34c80b9a269fbc6418536c8bd187965bff2a033810a317f2452cc049c" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.954175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h72xz" event={"ID":"29d85454-a8db-47bc-b616-bbdb4f6d8920","Type":"ContainerDied","Data":"a0f7c3f34c80b9a269fbc6418536c8bd187965bff2a033810a317f2452cc049c"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.955698 4795 generic.go:334] "Generic (PLEG): container finished" podID="282595b2-0eaa-4deb-9af4-288241817325" containerID="5c8a8280c91cc71f3e85d0b7e361ab76b49d5efc392b87ec1bb737d4336102fe" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.955721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-jrx2c" event={"ID":"282595b2-0eaa-4deb-9af4-288241817325","Type":"ContainerDied","Data":"5c8a8280c91cc71f3e85d0b7e361ab76b49d5efc392b87ec1bb737d4336102fe"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.958058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"84b0bcbc0062b2eec5eb90cbad2c6d5b12462c44819f0b7f936e0b7ddb57186c"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.960593 4795 generic.go:334] "Generic (PLEG): container finished" podID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" containerID="e324028f2d7a41155077f14e0d48b2d58c21ebdbf00e4a7e4bd8a8141187b6bd" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.960632 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p7s8n" event={"ID":"3778f66e-fd7f-4af5-ae3e-2a7c272785a0","Type":"ContainerDied","Data":"e324028f2d7a41155077f14e0d48b2d58c21ebdbf00e4a7e4bd8a8141187b6bd"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.964649 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.964709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.964741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.964755 4795 scope.go:117] "RemoveContainer" containerID="baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.966666 4795 generic.go:334] "Generic (PLEG): container finished" podID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" 
containerID="a5072d3cb490beed9fbcf82cc94732aed7e308348a7a93463173543a97f32666" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.966736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r8v4f" event={"ID":"573a7aa5-43d9-4523-8eea-4c1a36da49fb","Type":"ContainerDied","Data":"a5072d3cb490beed9fbcf82cc94732aed7e308348a7a93463173543a97f32666"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.968653 4795 generic.go:334] "Generic (PLEG): container finished" podID="1946f4fd-5254-4e66-8739-5a51af23e963" containerID="c82c86becfd7208e347af805b27fa40c1a5698022b6509cd540c7832c74ea578" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.968711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" event={"ID":"1946f4fd-5254-4e66-8739-5a51af23e963","Type":"ContainerDied","Data":"c82c86becfd7208e347af805b27fa40c1a5698022b6509cd540c7832c74ea578"} Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.524019 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" path="/var/lib/kubelet/pods/d2b418ec-23ae-4edd-8e61-0522a69c6be4/volumes" Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.892129 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.892440 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" containerID="cri-o://1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" gracePeriod=30 Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.892962 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" 
containerName="glance-httpd" containerID="cri-o://24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" gracePeriod=30 Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.981735 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5"} Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.485424 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": dial tcp 10.217.0.150:9292: connect: connection refused" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.485449 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": dial tcp 10.217.0.150:9292: connect: connection refused" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.558840 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.664454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") pod \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.664562 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") pod \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.665397 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "573a7aa5-43d9-4523-8eea-4c1a36da49fb" (UID: "573a7aa5-43d9-4523-8eea-4c1a36da49fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.690411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv" (OuterVolumeSpecName: "kube-api-access-vhtwv") pod "573a7aa5-43d9-4523-8eea-4c1a36da49fb" (UID: "573a7aa5-43d9-4523-8eea-4c1a36da49fb"). InnerVolumeSpecName "kube-api-access-vhtwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.767783 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.767819 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.797525 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.817544 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.821692 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.824022 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.830900 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.970962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") pod \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.971021 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") pod \"282595b2-0eaa-4deb-9af4-288241817325\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.971060 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqs8n\" (UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") pod \"29d85454-a8db-47bc-b616-bbdb4f6d8920\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.971083 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") pod \"6952c796-d85e-49b3-b931-60966311a0c0\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.971219 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") pod \"6952c796-d85e-49b3-b931-60966311a0c0\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972035 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") pod \"282595b2-0eaa-4deb-9af4-288241817325\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972075 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") pod \"29d85454-a8db-47bc-b616-bbdb4f6d8920\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972475 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6952c796-d85e-49b3-b931-60966311a0c0" (UID: "6952c796-d85e-49b3-b931-60966311a0c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972526 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "282595b2-0eaa-4deb-9af4-288241817325" (UID: "282595b2-0eaa-4deb-9af4-288241817325"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972648 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") pod \"1946f4fd-5254-4e66-8739-5a51af23e963\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972678 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29d85454-a8db-47bc-b616-bbdb4f6d8920" (UID: "29d85454-a8db-47bc-b616-bbdb4f6d8920"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972676 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") pod \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") pod \"1946f4fd-5254-4e66-8739-5a51af23e963\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973017 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3778f66e-fd7f-4af5-ae3e-2a7c272785a0" (UID: "3778f66e-fd7f-4af5-ae3e-2a7c272785a0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973023 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1946f4fd-5254-4e66-8739-5a51af23e963" (UID: "1946f4fd-5254-4e66-8739-5a51af23e963"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973448 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973487 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973498 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973507 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973515 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.976752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n" (OuterVolumeSpecName: "kube-api-access-dqs8n") pod "29d85454-a8db-47bc-b616-bbdb4f6d8920" (UID: "29d85454-a8db-47bc-b616-bbdb4f6d8920"). InnerVolumeSpecName "kube-api-access-dqs8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.977497 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth" (OuterVolumeSpecName: "kube-api-access-79mth") pod "282595b2-0eaa-4deb-9af4-288241817325" (UID: "282595b2-0eaa-4deb-9af4-288241817325"). InnerVolumeSpecName "kube-api-access-79mth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.977774 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l" (OuterVolumeSpecName: "kube-api-access-d4q9l") pod "3778f66e-fd7f-4af5-ae3e-2a7c272785a0" (UID: "3778f66e-fd7f-4af5-ae3e-2a7c272785a0"). InnerVolumeSpecName "kube-api-access-d4q9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.982146 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl" (OuterVolumeSpecName: "kube-api-access-jvxgl") pod "6952c796-d85e-49b3-b931-60966311a0c0" (UID: "6952c796-d85e-49b3-b931-60966311a0c0"). InnerVolumeSpecName "kube-api-access-jvxgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.983118 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b" (OuterVolumeSpecName: "kube-api-access-dkn9b") pod "1946f4fd-5254-4e66-8739-5a51af23e963" (UID: "1946f4fd-5254-4e66-8739-5a51af23e963"). InnerVolumeSpecName "kube-api-access-dkn9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.999768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h72xz" event={"ID":"29d85454-a8db-47bc-b616-bbdb4f6d8920","Type":"ContainerDied","Data":"af2bb649161fdcaac08f1bb77ef6e41cc4143b0e27292cc1c3c0624b994df5bf"} Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.999807 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af2bb649161fdcaac08f1bb77ef6e41cc4143b0e27292cc1c3c0624b994df5bf" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.999891 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.004642 4795 generic.go:334] "Generic (PLEG): container finished" podID="b449064d-5c14-4362-ba7b-a24ee9292789" containerID="1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" exitCode=143 Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.004753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerDied","Data":"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.006432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-jrx2c" event={"ID":"282595b2-0eaa-4deb-9af4-288241817325","Type":"ContainerDied","Data":"f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.006520 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.006652 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.014410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r8v4f" event={"ID":"573a7aa5-43d9-4523-8eea-4c1a36da49fb","Type":"ContainerDied","Data":"b1129ce7375e075c1c1844910d66cfc1b6308074f1d6f73dea4c3d974a9f4054"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.014451 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1129ce7375e075c1c1844910d66cfc1b6308074f1d6f73dea4c3d974a9f4054" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.014467 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.018413 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerID="32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec" exitCode=0 Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.018487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerDied","Data":"32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.044753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" event={"ID":"6952c796-d85e-49b3-b931-60966311a0c0","Type":"ContainerDied","Data":"0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.044786 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.044838 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.050943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.056441 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.058674 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.058970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p7s8n" event={"ID":"3778f66e-fd7f-4af5-ae3e-2a7c272785a0","Type":"ContainerDied","Data":"68df0b7ecf93ce028557646109c36262df3e9ce4a929fd966aed1e0fe5f5daff"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.058999 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68df0b7ecf93ce028557646109c36262df3e9ce4a929fd966aed1e0fe5f5daff" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.060914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" event={"ID":"1946f4fd-5254-4e66-8739-5a51af23e963","Type":"ContainerDied","Data":"1c20c0048ca6ea52ec70d130bc06285f1a2ec973f63fcbb08a3874e51e60bfaa"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.060951 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c20c0048ca6ea52ec70d130bc06285f1a2ec973f63fcbb08a3874e51e60bfaa" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.061011 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075107 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075131 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075141 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075150 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqs8n\" (UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075158 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.176605 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.176966 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177040 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177105 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177123 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177156 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177176 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs" (OuterVolumeSpecName: "logs") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177207 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177628 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.182128 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts" (OuterVolumeSpecName: "scripts") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.185302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw" (OuterVolumeSpecName: "kube-api-access-hz2gw") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "kube-api-access-hz2gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.187347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.227238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.241154 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data" (OuterVolumeSpecName: "config-data") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.247387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281790 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281839 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281854 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281901 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281914 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281924 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281933 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.306926 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.383366 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.603772 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.071080 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerDied","Data":"3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753"} Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.071120 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.071137 4795 scope.go:117] "RemoveContainer" containerID="32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.075845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972"} Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.096627 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.108055 4795 scope.go:117] "RemoveContainer" containerID="7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.109760 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.122768 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123332 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123356 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123377 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282595b2-0eaa-4deb-9af4-288241817325" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123386 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="282595b2-0eaa-4deb-9af4-288241817325" 
containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123395 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123403 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123417 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d85454-a8db-47bc-b616-bbdb4f6d8920" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d85454-a8db-47bc-b616-bbdb4f6d8920" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123440 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6952c796-d85e-49b3-b931-60966311a0c0" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123448 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6952c796-d85e-49b3-b931-60966311a0c0" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123461 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1946f4fd-5254-4e66-8739-5a51af23e963" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123468 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1946f4fd-5254-4e66-8739-5a51af23e963" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123488 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123496 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123505 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123513 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123726 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d85454-a8db-47bc-b616-bbdb4f6d8920" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123750 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1946f4fd-5254-4e66-8739-5a51af23e963" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123767 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123779 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123794 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="282595b2-0eaa-4deb-9af4-288241817325" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123807 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123816 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" 
Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123830 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6952c796-d85e-49b3-b931-60966311a0c0" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.125140 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.127398 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.127627 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.149054 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298222 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") 
pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298658 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8fsc\" (UniqueName: \"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298809 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298883 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.299046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400634 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") pod \"glance-default-external-api-0\" (UID: 
\"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8fsc\" (UniqueName: \"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400742 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.401355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.401783 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.402688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.409743 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.410793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.410986 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.411265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 
21:47:32.417942 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8fsc\" (UniqueName: \"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.427889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.457400 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.085915 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa"} Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086335 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086246 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="proxy-httpd" containerID="cri-o://78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" gracePeriod=30 Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086114 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-central-agent" 
containerID="cri-o://92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" gracePeriod=30 Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086287 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-notification-agent" containerID="cri-o://1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" gracePeriod=30 Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086295 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="sg-core" containerID="cri-o://bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" gracePeriod=30 Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.110241 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.338536036 podStartE2EDuration="6.110227328s" podCreationTimestamp="2026-02-19 21:47:27 +0000 UTC" firstStartedPulling="2026-02-19 21:47:28.779058868 +0000 UTC m=+1159.971576722" lastFinishedPulling="2026-02-19 21:47:32.55075015 +0000 UTC m=+1163.743268014" observedRunningTime="2026-02-19 21:47:33.109452157 +0000 UTC m=+1164.301970021" watchObservedRunningTime="2026-02-19 21:47:33.110227328 +0000 UTC m=+1164.302745192" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.180836 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.520046 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" path="/var/lib/kubelet/pods/ac1caac2-edf5-453d-a76d-e1c65b7f038b/volumes" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.556354 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" 
podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:60368->10.217.0.149:9292: read: connection reset by peer" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.556543 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:60376->10.217.0.149:9292: read: connection reset by peer" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.984107 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.106887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerStarted","Data":"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.107405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerStarted","Data":"6d8d28f68ae7a05b3b24448d485df065e39bc0509b04817346db5c0af58598b8"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110797 4795 generic.go:334] "Generic (PLEG): container finished" podID="b449064d-5c14-4362-ba7b-a24ee9292789" containerID="24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" exitCode=0 Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerDied","Data":"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerDied","Data":"fe3c35dfb7e24d88f06d64c3416cafe5c8ebe7a75022634f77450664da8f2158"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110914 4795 scope.go:117] "RemoveContainer" containerID="24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110931 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126499 4795 generic.go:334] "Generic (PLEG): container finished" podID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerID="78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" exitCode=0 Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126717 4795 generic.go:334] "Generic (PLEG): container finished" podID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerID="bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" exitCode=2 Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126806 4795 generic.go:334] "Generic (PLEG): container finished" podID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerID="1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" exitCode=0 Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126955 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.127016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139614 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139658 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139685 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139781 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139834 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139915 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.144233 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs" (OuterVolumeSpecName: "logs") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.144529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.145806 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z" (OuterVolumeSpecName: "kube-api-access-7zp5z") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "kube-api-access-7zp5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.149087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.149288 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts" (OuterVolumeSpecName: "scripts") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.160463 4795 scope.go:117] "RemoveContainer" containerID="1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.171346 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.184098 4795 scope.go:117] "RemoveContainer" containerID="24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" Feb 19 21:47:34 crc kubenswrapper[4795]: E0219 21:47:34.184596 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d\": container with ID starting with 24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d not found: ID does not exist" containerID="24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.184644 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d"} err="failed to get container status \"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d\": rpc error: code = NotFound desc = could not find container \"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d\": container with ID starting with 24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d not found: ID does not exist" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.184669 4795 scope.go:117] 
"RemoveContainer" containerID="1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" Feb 19 21:47:34 crc kubenswrapper[4795]: E0219 21:47:34.185018 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a\": container with ID starting with 1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a not found: ID does not exist" containerID="1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.185041 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a"} err="failed to get container status \"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a\": rpc error: code = NotFound desc = could not find container \"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a\": container with ID starting with 1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a not found: ID does not exist" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.195543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.197968 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data" (OuterVolumeSpecName: "config-data") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242543 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242583 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242597 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242609 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242622 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242659 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242672 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242692 4795 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.268503 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.344315 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.489529 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.500733 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.513891 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:34 crc kubenswrapper[4795]: E0219 21:47:34.514328 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.514351 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" Feb 19 21:47:34 crc kubenswrapper[4795]: E0219 21:47:34.514366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-httpd" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.514404 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-httpd" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.514632 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-httpd" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.516033 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.517689 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.519801 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.520248 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.529493 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649127 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649554 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649692 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751723 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751741 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.753248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.753340 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.753508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.758095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.758781 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.758815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.759321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.771831 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.777500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.836246 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:35 crc kubenswrapper[4795]: I0219 21:47:35.148672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerStarted","Data":"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217"} Feb 19 21:47:35 crc kubenswrapper[4795]: I0219 21:47:35.175423 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.175403891 podStartE2EDuration="3.175403891s" podCreationTimestamp="2026-02-19 21:47:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:35.171852721 +0000 UTC m=+1166.364370595" watchObservedRunningTime="2026-02-19 21:47:35.175403891 +0000 UTC m=+1166.367921745" Feb 19 21:47:35 crc kubenswrapper[4795]: W0219 21:47:35.420708 4795 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3697a3b0_4077_4837_bcdc_c17d8aa361f1.slice/crio-4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3 WatchSource:0}: Error finding container 4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3: Status 404 returned error can't find the container with id 4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3 Feb 19 21:47:35 crc kubenswrapper[4795]: I0219 21:47:35.422445 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:35 crc kubenswrapper[4795]: I0219 21:47:35.525251 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" path="/var/lib/kubelet/pods/b449064d-5c14-4362-ba7b-a24ee9292789/volumes" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.160615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerStarted","Data":"e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38"} Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.160965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerStarted","Data":"4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3"} Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.692109 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.693153 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.696480 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.696586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.697851 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-b4vdh" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.706101 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.785873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.786075 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.786262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " 
pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.786351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.888140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.888245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.888289 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.888317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: 
\"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.894990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.895083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.902815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.904558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:37 crc kubenswrapper[4795]: I0219 21:47:37.007962 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:37 crc kubenswrapper[4795]: I0219 21:47:37.176274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerStarted","Data":"c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7"} Feb 19 21:47:37 crc kubenswrapper[4795]: I0219 21:47:37.202720 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.202699623 podStartE2EDuration="3.202699623s" podCreationTimestamp="2026-02-19 21:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:37.200304456 +0000 UTC m=+1168.392822350" watchObservedRunningTime="2026-02-19 21:47:37.202699623 +0000 UTC m=+1168.395217497" Feb 19 21:47:37 crc kubenswrapper[4795]: I0219 21:47:37.526747 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:47:37 crc kubenswrapper[4795]: W0219 21:47:37.547450 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef98a0b8_d6d9_4075_ae60_e7d614a79e7f.slice/crio-0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae WatchSource:0}: Error finding container 0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae: Status 404 returned error can't find the container with id 0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae Feb 19 21:47:38 crc kubenswrapper[4795]: I0219 21:47:38.188328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" event={"ID":"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f","Type":"ContainerStarted","Data":"0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae"} Feb 
19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.653235 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743356 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743447 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743481 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743537 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743559 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.744023 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.744222 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.750443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6" (OuterVolumeSpecName: "kube-api-access-hqzt6") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "kube-api-access-hqzt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.751322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts" (OuterVolumeSpecName: "scripts") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.776584 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.809628 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846363 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846392 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846403 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846417 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846425 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846433 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.848878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data" (OuterVolumeSpecName: "config-data") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.950204 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209408 4795 generic.go:334] "Generic (PLEG): container finished" podID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerID="92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" exitCode=0 Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5"} Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"84b0bcbc0062b2eec5eb90cbad2c6d5b12462c44819f0b7f936e0b7ddb57186c"} Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209510 4795 scope.go:117] "RemoveContainer" containerID="78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209666 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.243945 4795 scope.go:117] "RemoveContainer" containerID="bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.244098 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.268607 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.269516 4795 scope.go:117] "RemoveContainer" containerID="1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.279592 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.280030 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="proxy-httpd" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280051 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="proxy-httpd" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.280081 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-notification-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280090 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-notification-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.280122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="sg-core" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280130 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="sg-core" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.280140 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-central-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280147 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-central-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280387 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-notification-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280405 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="proxy-httpd" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280426 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="sg-core" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280441 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-central-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.281915 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.284036 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.284217 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.303709 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.311758 4795 scope.go:117] "RemoveContainer" containerID="92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.348081 4795 scope.go:117] "RemoveContainer" containerID="78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.348552 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa\": container with ID starting with 78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa not found: ID does not exist" containerID="78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.348580 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa"} err="failed to get container status \"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa\": rpc error: code = NotFound desc = could not find container \"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa\": container with ID starting with 78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa not found: ID does not exist" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 
21:47:40.348605 4795 scope.go:117] "RemoveContainer" containerID="bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.348853 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972\": container with ID starting with bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972 not found: ID does not exist" containerID="bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.348872 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972"} err="failed to get container status \"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972\": rpc error: code = NotFound desc = could not find container \"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972\": container with ID starting with bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972 not found: ID does not exist" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.348887 4795 scope.go:117] "RemoveContainer" containerID="1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.349122 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9\": container with ID starting with 1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9 not found: ID does not exist" containerID="1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.349144 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9"} err="failed to get container status \"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9\": rpc error: code = NotFound desc = could not find container \"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9\": container with ID starting with 1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9 not found: ID does not exist" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.349159 4795 scope.go:117] "RemoveContainer" containerID="92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.350193 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5\": container with ID starting with 92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5 not found: ID does not exist" containerID="92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.350225 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5"} err="failed to get container status \"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5\": rpc error: code = NotFound desc = could not find container \"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5\": container with ID starting with 92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5 not found: ID does not exist" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.356832 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.356891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.356958 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.356985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.357006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.357045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 
21:47:40.357091 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.459269 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.462934 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.463074 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.463701 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.464814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.468592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.478878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.606063 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:41 crc kubenswrapper[4795]: I0219 21:47:41.090924 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:41 crc kubenswrapper[4795]: I0219 21:47:41.525885 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" path="/var/lib/kubelet/pods/253d2f67-fdba-4a38-9b30-8544e6e54cc4/volumes" Feb 19 21:47:42 crc kubenswrapper[4795]: I0219 21:47:42.458970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:47:42 crc kubenswrapper[4795]: I0219 21:47:42.459009 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:47:42 crc kubenswrapper[4795]: I0219 21:47:42.507814 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:47:42 crc kubenswrapper[4795]: I0219 21:47:42.508135 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:47:43 crc kubenswrapper[4795]: I0219 21:47:43.239098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:47:43 crc kubenswrapper[4795]: I0219 21:47:43.239158 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:47:44 crc kubenswrapper[4795]: I0219 21:47:44.836883 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:44 crc kubenswrapper[4795]: I0219 21:47:44.837266 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:44 crc kubenswrapper[4795]: I0219 21:47:44.871744 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:44 crc kubenswrapper[4795]: I0219 21:47:44.884593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: I0219 21:47:45.101514 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: I0219 21:47:45.105281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: I0219 21:47:45.260432 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: I0219 21:47:45.260748 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: W0219 21:47:45.560285 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc107007e_46bb_4d36_a899_18b499685b6c.slice/crio-8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966 WatchSource:0}: Error finding container 8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966: Status 404 returned error can't find the container with id 8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966 Feb 19 21:47:46 crc kubenswrapper[4795]: I0219 21:47:46.300925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966"} Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.308050 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-7ssdv" event={"ID":"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f","Type":"ContainerStarted","Data":"820a0240f9371635d4e5f03ad2ffcfa48ce070182eb86809479efd07b7626507"} Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.310899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38"} Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.349699 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" podStartSLOduration=2.856475896 podStartE2EDuration="11.349681604s" podCreationTimestamp="2026-02-19 21:47:36 +0000 UTC" firstStartedPulling="2026-02-19 21:47:37.549568268 +0000 UTC m=+1168.742086142" lastFinishedPulling="2026-02-19 21:47:46.042773986 +0000 UTC m=+1177.235291850" observedRunningTime="2026-02-19 21:47:47.341675699 +0000 UTC m=+1178.534193583" watchObservedRunningTime="2026-02-19 21:47:47.349681604 +0000 UTC m=+1178.542199478" Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.416657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.416877 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.471499 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:48 crc kubenswrapper[4795]: I0219 21:47:48.320941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51"} Feb 19 21:47:48 crc kubenswrapper[4795]: I0219 
21:47:48.321422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff"} Feb 19 21:47:50 crc kubenswrapper[4795]: I0219 21:47:50.339452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1"} Feb 19 21:47:50 crc kubenswrapper[4795]: I0219 21:47:50.340271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:47:50 crc kubenswrapper[4795]: I0219 21:47:50.376638 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.989882565 podStartE2EDuration="10.376618566s" podCreationTimestamp="2026-02-19 21:47:40 +0000 UTC" firstStartedPulling="2026-02-19 21:47:45.944479394 +0000 UTC m=+1177.136997258" lastFinishedPulling="2026-02-19 21:47:49.331215395 +0000 UTC m=+1180.523733259" observedRunningTime="2026-02-19 21:47:50.365386661 +0000 UTC m=+1181.557904525" watchObservedRunningTime="2026-02-19 21:47:50.376618566 +0000 UTC m=+1181.569136430" Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.420639 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.421483 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="proxy-httpd" containerID="cri-o://6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1" gracePeriod=30 Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.421621 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-central-agent" containerID="cri-o://96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38" gracePeriod=30 Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.421599 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-notification-agent" containerID="cri-o://7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff" gracePeriod=30 Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.421790 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="sg-core" containerID="cri-o://23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51" gracePeriod=30 Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362056 4795 generic.go:334] "Generic (PLEG): container finished" podID="c107007e-46bb-4d36-a899-18b499685b6c" containerID="6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1" exitCode=0 Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362396 4795 generic.go:334] "Generic (PLEG): container finished" podID="c107007e-46bb-4d36-a899-18b499685b6c" containerID="23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51" exitCode=2 Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362405 4795 generic.go:334] "Generic (PLEG): container finished" podID="c107007e-46bb-4d36-a899-18b499685b6c" containerID="7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff" exitCode=0 Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1"} Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 
21:47:53.362452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51"} Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362462 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff"} Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.373433 4795 generic.go:334] "Generic (PLEG): container finished" podID="c107007e-46bb-4d36-a899-18b499685b6c" containerID="96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38" exitCode=0 Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.373526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38"} Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.484835 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624472 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624511 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624602 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624653 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624709 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.625288 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.625515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.626045 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.626068 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.630335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts" (OuterVolumeSpecName: "scripts") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.635455 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4" (OuterVolumeSpecName: "kube-api-access-bbrc4") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "kube-api-access-bbrc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.658534 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.704214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.719892 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data" (OuterVolumeSpecName: "config-data") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.727975 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.727996 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.728006 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.728017 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 
crc kubenswrapper[4795]: I0219 21:47:54.728025 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.385780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966"} Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.385835 4795 scope.go:117] "RemoveContainer" containerID="6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.385871 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.408855 4795 scope.go:117] "RemoveContainer" containerID="23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.434998 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.442863 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.457514 4795 scope.go:117] "RemoveContainer" containerID="7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.472515 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:55 crc kubenswrapper[4795]: E0219 21:47:55.477287 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="sg-core" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477337 4795 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="sg-core" Feb 19 21:47:55 crc kubenswrapper[4795]: E0219 21:47:55.477355 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="proxy-httpd" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477363 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="proxy-httpd" Feb 19 21:47:55 crc kubenswrapper[4795]: E0219 21:47:55.477391 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-notification-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477399 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-notification-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: E0219 21:47:55.477439 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-central-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477447 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-central-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477767 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-notification-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477795 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="proxy-httpd" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477820 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-central-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: 
I0219 21:47:55.477834 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="sg-core" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.479922 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.482569 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.482774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.482964 4795 scope.go:117] "RemoveContainer" containerID="96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.484510 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.537399 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c107007e-46bb-4d36-a899-18b499685b6c" path="/var/lib/kubelet/pods/c107007e-46bb-4d36-a899-18b499685b6c/volumes" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642087 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc 
kubenswrapper[4795]: I0219 21:47:55.642294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642362 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642406 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.743991 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: 
I0219 21:47:55.744728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.745503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.745515 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.748591 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.749801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.751138 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.761558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.768018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.805484 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:56 crc kubenswrapper[4795]: I0219 21:47:56.301965 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:56 crc kubenswrapper[4795]: I0219 21:47:56.396997 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"2512790f4175863e7da7d55dc8c6ebb57bbf253fa486687fa567e90d3c41dca6"} Feb 19 21:47:58 crc kubenswrapper[4795]: I0219 21:47:58.438318 4795 generic.go:334] "Generic (PLEG): container finished" podID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" containerID="820a0240f9371635d4e5f03ad2ffcfa48ce070182eb86809479efd07b7626507" exitCode=0 Feb 19 21:47:58 crc kubenswrapper[4795]: I0219 21:47:58.438400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" event={"ID":"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f","Type":"ContainerDied","Data":"820a0240f9371635d4e5f03ad2ffcfa48ce070182eb86809479efd07b7626507"} Feb 19 21:47:58 crc kubenswrapper[4795]: 
I0219 21:47:58.447133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8"} Feb 19 21:47:58 crc kubenswrapper[4795]: I0219 21:47:58.447233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b"} Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.457433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da"} Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.858808 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.924872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") pod \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.925037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") pod \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.925078 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") pod \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.925248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") pod \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.930698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts" (OuterVolumeSpecName: "scripts") pod "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" (UID: "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.934203 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt" (OuterVolumeSpecName: "kube-api-access-zhcgt") pod "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" (UID: "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f"). InnerVolumeSpecName "kube-api-access-zhcgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.952309 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" (UID: "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.957292 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data" (OuterVolumeSpecName: "config-data") pod "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" (UID: "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.027048 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.027071 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.027082 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.027091 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.473363 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" event={"ID":"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f","Type":"ContainerDied","Data":"0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae"} Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.473662 4795 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.473405 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.481226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b"} Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.481456 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.519511 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.832063113 podStartE2EDuration="5.519493402s" podCreationTimestamp="2026-02-19 21:47:55 +0000 UTC" firstStartedPulling="2026-02-19 21:47:56.311987671 +0000 UTC m=+1187.504505535" lastFinishedPulling="2026-02-19 21:47:59.99941796 +0000 UTC m=+1191.191935824" observedRunningTime="2026-02-19 21:48:00.513105972 +0000 UTC m=+1191.705623836" watchObservedRunningTime="2026-02-19 21:48:00.519493402 +0000 UTC m=+1191.712011286" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.563449 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:48:00 crc kubenswrapper[4795]: E0219 21:48:00.563818 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" containerName="nova-cell0-conductor-db-sync" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.563836 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" containerName="nova-cell0-conductor-db-sync" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.563988 
4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" containerName="nova-cell0-conductor-db-sync" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.564537 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.566758 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.566820 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-b4vdh" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.581954 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.637418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.637487 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.637532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " 
pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.739038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.739120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.739189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.743232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.744699 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.761121 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.888238 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:01 crc kubenswrapper[4795]: I0219 21:48:01.371073 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:48:01 crc kubenswrapper[4795]: I0219 21:48:01.492985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"57b83043-2f7c-4b55-a2b9-66eef96f0008","Type":"ContainerStarted","Data":"83f719d65e236fae031c225d4f8065a2b4c198be5a5993edbe70af70bfebe600"} Feb 19 21:48:02 crc kubenswrapper[4795]: I0219 21:48:02.502643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"57b83043-2f7c-4b55-a2b9-66eef96f0008","Type":"ContainerStarted","Data":"4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6"} Feb 19 21:48:02 crc kubenswrapper[4795]: I0219 21:48:02.503107 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:02 crc kubenswrapper[4795]: I0219 21:48:02.535224 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.535156602 podStartE2EDuration="2.535156602s" podCreationTimestamp="2026-02-19 21:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:02.524319177 +0000 UTC m=+1193.716837031" watchObservedRunningTime="2026-02-19 21:48:02.535156602 +0000 UTC 
m=+1193.727674466" Feb 19 21:48:10 crc kubenswrapper[4795]: I0219 21:48:10.938465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.440493 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.442346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.445268 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.445277 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.468492 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.582218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.582305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.582426 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.582459 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.611545 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.612891 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.615903 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.637514 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v6ct\" (UniqueName: \"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") pod \"nova-api-0\" (UID: 
\"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684921 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.705041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.705439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.705504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.715375 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.717910 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.719035 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.723521 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.731859 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.762604 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.765247 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.766531 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.770840 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786251 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786385 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") 
pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v6ct\" (UniqueName: \"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.799847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.800650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.804424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.832830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v6ct\" (UniqueName: 
\"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.835222 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.861629 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.863312 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.874703 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 
21:48:11.888710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888740 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.907810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.910633 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.914531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.914592 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.923423 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.925268 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.930528 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.940918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.957545 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.989926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.989964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.989987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 
21:48:11.990394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990502 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990659 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 
21:48:11.990708 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990732 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.993870 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.995657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.006194 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092690 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092903 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092959 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.093033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") pod 
\"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.093750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.094501 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.095844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.096302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.096479 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: 
I0219 21:48:12.097141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.100553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.101313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.111852 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.114248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.272141 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.288968 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.296756 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.420918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"] Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.535153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.608316 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x4mls" event={"ID":"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd","Type":"ContainerStarted","Data":"88f56eb2aa26f82d9b17df46fcb912c867ec3d628b4dff9a39f509961e5df80b"} Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.621764 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerStarted","Data":"70232baf2a46181b0fb51eefd50334fc1763134daec5b9978cd7cc19312a07a8"} Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.626361 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"] Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.627529 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.630379 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.630633 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.640152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.656997 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"] Feb 19 21:48:12 crc kubenswrapper[4795]: W0219 21:48:12.657870 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5aecdca_84a6_4987_8b84_a95fbb0096f9.slice/crio-13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77 WatchSource:0}: Error finding container 13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77: Status 404 returned error can't find the container with id 13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77 Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.693156 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.710889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.710936 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.710971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.711009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.812964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.813123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 
21:48:12.813182 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.813229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.822633 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.835100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.840796 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.843171 4795 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.843223 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.954071 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.993279 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:48:13 crc kubenswrapper[4795]: W0219 21:48:13.006503 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59d807ee_6555_4c2f_8598_9f264d5a95f9.slice/crio-919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94 WatchSource:0}: Error finding container 919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94: Status 404 returned error can't find the container with id 919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94 Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.426246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"] Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.634028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dabff2c-427b-4307-b949-23fdde980292","Type":"ContainerStarted","Data":"7a24e1fe5ec311d17f9a97363d6465ddfdef8ead02652c4242800fc85c6ff620"} Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.636728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-x4mls" event={"ID":"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd","Type":"ContainerStarted","Data":"54158f46698b83516d68eee860a71369629e83ec491495039fcc2f5f6a5bd317"} Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.639413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerStarted","Data":"9eb4a81c900f6ce7b3c066bf56f65f67c3231408465f0d632475d3104e06db3a"} Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.642481 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5aecdca-84a6-4987-8b84-a95fbb0096f9","Type":"ContainerStarted","Data":"13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77"} Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.643968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" event={"ID":"fd8e89cd-b890-4f36-9008-59767ccbad91","Type":"ContainerStarted","Data":"199857e63a023ba2f3b45450f39bdc22aaf434e44f27b4974099a3d7f5c19db9"} Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.649519 4795 generic.go:334] "Generic (PLEG): container finished" podID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerID="c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895" exitCode=0 Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.649573 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerDied","Data":"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895"} Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.649598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" 
event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerStarted","Data":"919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94"} Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.652404 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-x4mls" podStartSLOduration=2.652389391 podStartE2EDuration="2.652389391s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:13.649564192 +0000 UTC m=+1204.842082056" watchObservedRunningTime="2026-02-19 21:48:13.652389391 +0000 UTC m=+1204.844907255" Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.668856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerStarted","Data":"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f"} Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.670231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.675222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" event={"ID":"fd8e89cd-b890-4f36-9008-59767ccbad91","Type":"ContainerStarted","Data":"971150e9373b5419fd5efc7fc3e78faafd9906f987cb7e0a9e042f04e22cb5a9"} Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.694040 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" podStartSLOduration=3.694018026 podStartE2EDuration="3.694018026s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:14.692317528 +0000 
UTC m=+1205.884835392" watchObservedRunningTime="2026-02-19 21:48:14.694018026 +0000 UTC m=+1205.886535890" Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.711761 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" podStartSLOduration=2.711745594 podStartE2EDuration="2.711745594s" podCreationTimestamp="2026-02-19 21:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:14.707907076 +0000 UTC m=+1205.900424940" watchObservedRunningTime="2026-02-19 21:48:14.711745594 +0000 UTC m=+1205.904263458" Feb 19 21:48:15 crc kubenswrapper[4795]: I0219 21:48:15.545234 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:15 crc kubenswrapper[4795]: I0219 21:48:15.556204 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.694128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dabff2c-427b-4307-b949-23fdde980292","Type":"ContainerStarted","Data":"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"} Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.694690 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4dabff2c-427b-4307-b949-23fdde980292" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45" gracePeriod=30 Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.696409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerStarted","Data":"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"} Feb 19 21:48:16 
crc kubenswrapper[4795]: I0219 21:48:16.696458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerStarted","Data":"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"} Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.696613 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-log" containerID="cri-o://2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596" gracePeriod=30 Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.696727 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-metadata" containerID="cri-o://2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe" gracePeriod=30 Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.699047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5aecdca-84a6-4987-8b84-a95fbb0096f9","Type":"ContainerStarted","Data":"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6"} Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.709403 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerStarted","Data":"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117"} Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.709434 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerStarted","Data":"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc"} Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.712883 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.964727427 podStartE2EDuration="5.712864747s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="2026-02-19 21:48:12.827740463 +0000 UTC m=+1204.020258327" lastFinishedPulling="2026-02-19 21:48:15.575877783 +0000 UTC m=+1206.768395647" observedRunningTime="2026-02-19 21:48:16.710387857 +0000 UTC m=+1207.902905721" watchObservedRunningTime="2026-02-19 21:48:16.712864747 +0000 UTC m=+1207.905382611" Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.733814 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9415001050000003 podStartE2EDuration="5.733793835s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="2026-02-19 21:48:12.714366068 +0000 UTC m=+1203.906883932" lastFinishedPulling="2026-02-19 21:48:15.506659798 +0000 UTC m=+1206.699177662" observedRunningTime="2026-02-19 21:48:16.733497996 +0000 UTC m=+1207.926015860" watchObservedRunningTime="2026-02-19 21:48:16.733793835 +0000 UTC m=+1207.926311699" Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.757118 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.736723271 podStartE2EDuration="5.757099019s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="2026-02-19 21:48:12.555628388 +0000 UTC m=+1203.748146252" lastFinishedPulling="2026-02-19 21:48:15.576004136 +0000 UTC m=+1206.768522000" observedRunningTime="2026-02-19 21:48:16.751279485 +0000 UTC m=+1207.943797369" watchObservedRunningTime="2026-02-19 21:48:16.757099019 +0000 UTC m=+1207.949616883" Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.770724 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.927286374 podStartE2EDuration="5.770708191s" 
podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="2026-02-19 21:48:12.664442765 +0000 UTC m=+1203.856960629" lastFinishedPulling="2026-02-19 21:48:15.507864582 +0000 UTC m=+1206.700382446" observedRunningTime="2026-02-19 21:48:16.765911936 +0000 UTC m=+1207.958429800" watchObservedRunningTime="2026-02-19 21:48:16.770708191 +0000 UTC m=+1207.963226055" Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.958479 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.272787 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.291280 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.291343 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.345114 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.532603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") pod \"c55a0492-b022-4514-85d5-d35d3ec46f05\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.532679 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") pod \"c55a0492-b022-4514-85d5-d35d3ec46f05\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.532835 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") pod \"c55a0492-b022-4514-85d5-d35d3ec46f05\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.532891 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") pod \"c55a0492-b022-4514-85d5-d35d3ec46f05\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.533700 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs" (OuterVolumeSpecName: "logs") pod "c55a0492-b022-4514-85d5-d35d3ec46f05" (UID: "c55a0492-b022-4514-85d5-d35d3ec46f05"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.553977 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g" (OuterVolumeSpecName: "kube-api-access-lht4g") pod "c55a0492-b022-4514-85d5-d35d3ec46f05" (UID: "c55a0492-b022-4514-85d5-d35d3ec46f05"). InnerVolumeSpecName "kube-api-access-lht4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.561429 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data" (OuterVolumeSpecName: "config-data") pod "c55a0492-b022-4514-85d5-d35d3ec46f05" (UID: "c55a0492-b022-4514-85d5-d35d3ec46f05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.570907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c55a0492-b022-4514-85d5-d35d3ec46f05" (UID: "c55a0492-b022-4514-85d5-d35d3ec46f05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.635258 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.635295 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.635308 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.635322 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722381 4795 generic.go:334] "Generic (PLEG): container finished" podID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe" exitCode=0 Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722409 4795 generic.go:334] "Generic (PLEG): container finished" podID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596" exitCode=143 Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerDied","Data":"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"} Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722489 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerDied","Data":"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"} Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerDied","Data":"9eb4a81c900f6ce7b3c066bf56f65f67c3231408465f0d632475d3104e06db3a"} Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722561 4795 scope.go:117] "RemoveContainer" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.761106 4795 scope.go:117] "RemoveContainer" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.786771 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.788432 4795 scope.go:117] "RemoveContainer" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe" Feb 19 21:48:17 crc kubenswrapper[4795]: E0219 21:48:17.788885 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": container with ID starting with 2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe not found: ID does not exist" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.788931 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"} err="failed to get container status \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": rpc error: code = NotFound desc = could not find container \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": container with ID starting with 2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe not found: ID does not exist" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.788958 4795 scope.go:117] "RemoveContainer" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596" Feb 19 21:48:17 crc kubenswrapper[4795]: E0219 21:48:17.789377 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": container with ID starting with 2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596 not found: ID does not exist" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.789417 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"} err="failed to get container status \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": rpc error: code = NotFound desc = could not find container \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": container with ID starting with 2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596 not found: ID does not exist" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.789441 4795 scope.go:117] "RemoveContainer" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.789757 4795 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"} err="failed to get container status \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": rpc error: code = NotFound desc = could not find container \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": container with ID starting with 2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe not found: ID does not exist" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.789791 4795 scope.go:117] "RemoveContainer" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.790156 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"} err="failed to get container status \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": rpc error: code = NotFound desc = could not find container \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": container with ID starting with 2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596 not found: ID does not exist" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.811254 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825195 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:17 crc kubenswrapper[4795]: E0219 21:48:17.825599 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-log" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825615 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-log" Feb 19 21:48:17 crc kubenswrapper[4795]: E0219 
21:48:17.825637 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-metadata" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825643 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-metadata" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825797 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-log" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825813 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-metadata" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.826720 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.829309 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.829567 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.834583 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5hmb\" 
(UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941345 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.042862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5hmb\" (UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.042933 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.042963 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.043004 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.043088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.043485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.048557 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 
21:48:18.062053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5hmb\" (UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.062589 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.072084 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.147568 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.640814 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.736323 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerStarted","Data":"79584481ab899a406532b67e6a609dc80a7856d39f0219a0cc1743be67cc18e6"} Feb 19 21:48:19 crc kubenswrapper[4795]: I0219 21:48:19.528706 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" path="/var/lib/kubelet/pods/c55a0492-b022-4514-85d5-d35d3ec46f05/volumes" Feb 19 21:48:19 crc kubenswrapper[4795]: I0219 21:48:19.747373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerStarted","Data":"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1"} Feb 19 21:48:19 crc kubenswrapper[4795]: I0219 21:48:19.747429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerStarted","Data":"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f"} Feb 19 21:48:19 crc kubenswrapper[4795]: I0219 21:48:19.790639 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.790579475 podStartE2EDuration="2.790579475s" podCreationTimestamp="2026-02-19 21:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:19.776369096 +0000 UTC m=+1210.968887000" watchObservedRunningTime="2026-02-19 21:48:19.790579475 +0000 UTC m=+1210.983097369" Feb 19 21:48:20 crc kubenswrapper[4795]: I0219 21:48:20.765133 4795 
generic.go:334] "Generic (PLEG): container finished" podID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" containerID="54158f46698b83516d68eee860a71369629e83ec491495039fcc2f5f6a5bd317" exitCode=0 Feb 19 21:48:20 crc kubenswrapper[4795]: I0219 21:48:20.765266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x4mls" event={"ID":"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd","Type":"ContainerDied","Data":"54158f46698b83516d68eee860a71369629e83ec491495039fcc2f5f6a5bd317"} Feb 19 21:48:20 crc kubenswrapper[4795]: I0219 21:48:20.769264 4795 generic.go:334] "Generic (PLEG): container finished" podID="fd8e89cd-b890-4f36-9008-59767ccbad91" containerID="971150e9373b5419fd5efc7fc3e78faafd9906f987cb7e0a9e042f04e22cb5a9" exitCode=0 Feb 19 21:48:20 crc kubenswrapper[4795]: I0219 21:48:20.769307 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" event={"ID":"fd8e89cd-b890-4f36-9008-59767ccbad91","Type":"ContainerDied","Data":"971150e9373b5419fd5efc7fc3e78faafd9906f987cb7e0a9e042f04e22cb5a9"} Feb 19 21:48:21 crc kubenswrapper[4795]: I0219 21:48:21.931923 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:48:21 crc kubenswrapper[4795]: I0219 21:48:21.931989 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:48:21 crc kubenswrapper[4795]: I0219 21:48:21.958972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.022648 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.270814 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.279768 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.298365 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.372967 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.373599 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="dnsmasq-dns" containerID="cri-o://70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9" gracePeriod=10 Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444599 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") pod \"fd8e89cd-b890-4f36-9008-59767ccbad91\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") pod \"fd8e89cd-b890-4f36-9008-59767ccbad91\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") pod \"fd8e89cd-b890-4f36-9008-59767ccbad91\" (UID: 
\"fd8e89cd-b890-4f36-9008-59767ccbad91\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") pod \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") pod \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") pod \"fd8e89cd-b890-4f36-9008-59767ccbad91\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") pod \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.445031 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") pod \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.460921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc" (OuterVolumeSpecName: "kube-api-access-rrmdc") pod "fd8e89cd-b890-4f36-9008-59767ccbad91" (UID: "fd8e89cd-b890-4f36-9008-59767ccbad91"). InnerVolumeSpecName "kube-api-access-rrmdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.460987 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts" (OuterVolumeSpecName: "scripts") pod "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" (UID: "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.471239 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts" (OuterVolumeSpecName: "scripts") pod "fd8e89cd-b890-4f36-9008-59767ccbad91" (UID: "fd8e89cd-b890-4f36-9008-59767ccbad91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.475829 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc" (OuterVolumeSpecName: "kube-api-access-qggxc") pod "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" (UID: "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd"). InnerVolumeSpecName "kube-api-access-qggxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.506002 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data" (OuterVolumeSpecName: "config-data") pod "fd8e89cd-b890-4f36-9008-59767ccbad91" (UID: "fd8e89cd-b890-4f36-9008-59767ccbad91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.509356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" (UID: "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.529450 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd8e89cd-b890-4f36-9008-59767ccbad91" (UID: "fd8e89cd-b890-4f36-9008-59767ccbad91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.541271 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data" (OuterVolumeSpecName: "config-data") pod "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" (UID: "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548493 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548540 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548555 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548568 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548578 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548588 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548599 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548607 4795 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.788285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x4mls" event={"ID":"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd","Type":"ContainerDied","Data":"88f56eb2aa26f82d9b17df46fcb912c867ec3d628b4dff9a39f509961e5df80b"} Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.788620 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f56eb2aa26f82d9b17df46fcb912c867ec3d628b4dff9a39f509961e5df80b" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.788697 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.798048 4795 generic.go:334] "Generic (PLEG): container finished" podID="df387754-5537-4d85-950b-02743c881da8" containerID="70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9" exitCode=0 Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.798103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerDied","Data":"70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9"} Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.800483 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.800569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" event={"ID":"fd8e89cd-b890-4f36-9008-59767ccbad91","Type":"ContainerDied","Data":"199857e63a023ba2f3b45450f39bdc22aaf434e44f27b4974099a3d7f5c19db9"} Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.800590 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="199857e63a023ba2f3b45450f39bdc22aaf434e44f27b4974099a3d7f5c19db9" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.858820 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.893817 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.910279 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:48:22 crc kubenswrapper[4795]: E0219 21:48:22.910939 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="init" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.911156 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="init" Feb 19 21:48:22 crc kubenswrapper[4795]: E0219 21:48:22.911464 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8e89cd-b890-4f36-9008-59767ccbad91" containerName="nova-cell1-conductor-db-sync" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.911527 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8e89cd-b890-4f36-9008-59767ccbad91" containerName="nova-cell1-conductor-db-sync" Feb 19 21:48:22 crc kubenswrapper[4795]: E0219 21:48:22.911605 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="dnsmasq-dns" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.911669 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="dnsmasq-dns" Feb 19 21:48:22 crc kubenswrapper[4795]: E0219 21:48:22.911750 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" containerName="nova-manage" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.911822 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" containerName="nova-manage" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.912071 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" containerName="nova-manage" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.912182 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="dnsmasq-dns" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.912245 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8e89cd-b890-4f36-9008-59767ccbad91" containerName="nova-cell1-conductor-db-sync" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.912924 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.914653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.933277 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.990693 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.990977 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" containerID="cri-o://2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" gracePeriod=30 Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.991112 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" containerID="cri-o://70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" gracePeriod=30 Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.995141 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.995316 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 
21:48:23.056718 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.056957 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-log" containerID="cri-o://dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" gracePeriod=30 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.057280 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-metadata" containerID="cri-o://b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" gracePeriod=30 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071303 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071620 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071826 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071918 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.072022 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.072342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.072481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.072609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.080531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s" (OuterVolumeSpecName: "kube-api-access-2hb2s") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "kube-api-access-2hb2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.129067 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.129080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config" (OuterVolumeSpecName: "config") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.129415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.132561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.135894 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.148670 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.148895 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174263 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174274 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174283 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174291 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174299 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174308 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.182793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.183938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.193777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.229254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.402841 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.639870 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.801052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.802385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.802792 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.802937 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5hmb\" (UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.803058 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.803977 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs" (OuterVolumeSpecName: "logs") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.808491 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb" (OuterVolumeSpecName: "kube-api-access-f5hmb") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "kube-api-access-f5hmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.813669 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerDied","Data":"8495c4433e798d1ac2e79ddf9c88ebf569c3cedf6bdc46579d7bb36aaf2eff72"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.813749 4795 scope.go:117] "RemoveContainer" containerID="70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.813982 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823026 4795 generic.go:334] "Generic (PLEG): container finished" podID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" exitCode=0 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823072 4795 generic.go:334] "Generic (PLEG): container finished" podID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" exitCode=143 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823220 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823384 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerDied","Data":"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerDied","Data":"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823468 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerDied","Data":"79584481ab899a406532b67e6a609dc80a7856d39f0219a0cc1743be67cc18e6"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.828351 4795 generic.go:334] "Generic (PLEG): container finished" podID="1207388e-c327-4eb4-b81f-dee124375ca8" containerID="2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" exitCode=143 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.828435 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerDied","Data":"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.843275 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.850072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data" (OuterVolumeSpecName: "config-data") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: W0219 21:48:23.850084 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6fd7841_2a08_4786_8e96_b2ab0f477eff.slice/crio-d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96 WatchSource:0}: Error finding container d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96: Status 404 returned error can't find the container with id d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.857360 4795 scope.go:117] "RemoveContainer" containerID="93a750f0a2156e5774cb54d38a230aaaa13bdcb55ce20e481877fcd04f00ee2e" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.862311 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.871985 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.880417 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.882792 4795 scope.go:117] "RemoveContainer" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.891672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.907053 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.912467 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.912498 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5hmb\" (UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.912510 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.912522 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.915856 4795 scope.go:117] "RemoveContainer" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.941719 4795 scope.go:117] "RemoveContainer" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" Feb 19 21:48:23 crc kubenswrapper[4795]: E0219 21:48:23.942501 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": container with ID starting with b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1 not found: ID does not exist" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.942545 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1"} err="failed to get container status \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": rpc error: code = NotFound desc = could not find container \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": container with ID starting with b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1 not found: ID does not exist" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.942573 4795 scope.go:117] "RemoveContainer" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" Feb 19 21:48:23 crc kubenswrapper[4795]: E0219 21:48:23.942967 4795 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": container with ID starting with dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f not found: ID does not exist" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.942988 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f"} err="failed to get container status \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": rpc error: code = NotFound desc = could not find container \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": container with ID starting with dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f not found: ID does not exist" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.943002 4795 scope.go:117] "RemoveContainer" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.943313 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1"} err="failed to get container status \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": rpc error: code = NotFound desc = could not find container \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": container with ID starting with b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1 not found: ID does not exist" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.943354 4795 scope.go:117] "RemoveContainer" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.943606 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f"} err="failed to get container status \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": rpc error: code = NotFound desc = could not find container \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": container with ID starting with dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f not found: ID does not exist" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.201302 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.210419 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.227722 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:24 crc kubenswrapper[4795]: E0219 21:48:24.228140 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-log" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.228159 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-log" Feb 19 21:48:24 crc kubenswrapper[4795]: E0219 21:48:24.228205 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-metadata" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.228213 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-metadata" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.228390 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-log" Feb 19 
21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.228413 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-metadata" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.229342 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.231220 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.241737 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.246066 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " 
pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318426 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.419782 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.419843 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.420053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.420729 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.420793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.421408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.424032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.432783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.434597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") pod \"nova-metadata-0\" (UID: 
\"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.442805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.550113 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.840040 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6fd7841-2a08-4786-8e96-b2ab0f477eff","Type":"ContainerStarted","Data":"4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3"} Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.840389 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.840405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6fd7841-2a08-4786-8e96-b2ab0f477eff","Type":"ContainerStarted","Data":"d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96"} Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.846312 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" containerID="cri-o://6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" gracePeriod=30 Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.858468 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.858449587 
podStartE2EDuration="2.858449587s" podCreationTimestamp="2026-02-19 21:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:24.854427914 +0000 UTC m=+1216.046945778" watchObservedRunningTime="2026-02-19 21:48:24.858449587 +0000 UTC m=+1216.050967451" Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.046066 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:25 crc kubenswrapper[4795]: W0219 21:48:25.056534 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0bc1cc_7985_4a3f_8ab8_26d49f7706c8.slice/crio-d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95 WatchSource:0}: Error finding container d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95: Status 404 returned error can't find the container with id d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95 Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.526938 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" path="/var/lib/kubelet/pods/3a3c4594-ea43-437d-8528-fd360fd4c4f9/volumes" Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.529026 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df387754-5537-4d85-950b-02743c881da8" path="/var/lib/kubelet/pods/df387754-5537-4d85-950b-02743c881da8/volumes" Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.814674 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.866978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerStarted","Data":"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294"} Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.867044 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerStarted","Data":"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145"} Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.867056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerStarted","Data":"d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95"} Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.901039 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.901019319 podStartE2EDuration="1.901019319s" podCreationTimestamp="2026-02-19 21:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:25.898791056 +0000 UTC m=+1217.091308920" watchObservedRunningTime="2026-02-19 21:48:25.901019319 +0000 UTC m=+1217.093537193" Feb 19 21:48:26 crc kubenswrapper[4795]: E0219 21:48:26.960408 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:48:26 crc kubenswrapper[4795]: E0219 21:48:26.961204 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 is running 
failed: container process not found" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:48:26 crc kubenswrapper[4795]: E0219 21:48:26.961430 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 is running failed: container process not found" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:48:26 crc kubenswrapper[4795]: E0219 21:48:26.961462 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.351417 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.390611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") pod \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.390789 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") pod \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.390985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") pod \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.403119 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt" (OuterVolumeSpecName: "kube-api-access-w9jbt") pod "d5aecdca-84a6-4987-8b84-a95fbb0096f9" (UID: "d5aecdca-84a6-4987-8b84-a95fbb0096f9"). InnerVolumeSpecName "kube-api-access-w9jbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.425513 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data" (OuterVolumeSpecName: "config-data") pod "d5aecdca-84a6-4987-8b84-a95fbb0096f9" (UID: "d5aecdca-84a6-4987-8b84-a95fbb0096f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.444546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5aecdca-84a6-4987-8b84-a95fbb0096f9" (UID: "d5aecdca-84a6-4987-8b84-a95fbb0096f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.493479 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.493515 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.493531 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883800 4795 generic.go:334] "Generic (PLEG): container finished" podID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" exitCode=0 Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5aecdca-84a6-4987-8b84-a95fbb0096f9","Type":"ContainerDied","Data":"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6"} Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"d5aecdca-84a6-4987-8b84-a95fbb0096f9","Type":"ContainerDied","Data":"13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77"} Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883893 4795 scope.go:117] "RemoveContainer" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883929 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.903728 4795 scope.go:117] "RemoveContainer" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" Feb 19 21:48:27 crc kubenswrapper[4795]: E0219 21:48:27.904110 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6\": container with ID starting with 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 not found: ID does not exist" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.904183 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6"} err="failed to get container status \"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6\": rpc error: code = NotFound desc = could not find container \"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6\": container with ID starting with 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 not found: ID does not exist" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.929444 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.941604 4795 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.953000 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:27 crc kubenswrapper[4795]: E0219 21:48:27.953660 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.953694 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.954023 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.954913 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.956940 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.962937 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.004192 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.004257 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.004603 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.105992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.106071 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.106129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.111933 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") 
" pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.123357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.126911 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.271945 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.736771 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.863424 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897584 4795 generic.go:334] "Generic (PLEG): container finished" podID="1207388e-c327-4eb4-b81f-dee124375ca8" containerID="70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" exitCode=0 Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerDied","Data":"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117"} Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897673 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerDied","Data":"70232baf2a46181b0fb51eefd50334fc1763134daec5b9978cd7cc19312a07a8"} Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897693 4795 scope.go:117] "RemoveContainer" containerID="70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897791 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.901959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0f49585-0601-424b-9f28-304ae06c9d93","Type":"ContainerStarted","Data":"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5"} Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.902228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0f49585-0601-424b-9f28-304ae06c9d93","Type":"ContainerStarted","Data":"cd865b730fa4ce805f52ae67f6b00c0275c97a6018c4d5724e64d28e7cd4b5db"} Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.920891 4795 scope.go:117] "RemoveContainer" containerID="2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.923302 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9232872300000001 podStartE2EDuration="1.92328723s" podCreationTimestamp="2026-02-19 21:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:28.920581364 +0000 UTC m=+1220.113099218" watchObservedRunningTime="2026-02-19 21:48:28.92328723 +0000 UTC m=+1220.115805094" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.946516 4795 scope.go:117] "RemoveContainer" containerID="70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" Feb 19 21:48:28 crc kubenswrapper[4795]: E0219 21:48:28.946991 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117\": container with ID starting with 70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117 not found: ID does not exist" 
containerID="70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.947045 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117"} err="failed to get container status \"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117\": rpc error: code = NotFound desc = could not find container \"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117\": container with ID starting with 70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117 not found: ID does not exist" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.947071 4795 scope.go:117] "RemoveContainer" containerID="2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" Feb 19 21:48:28 crc kubenswrapper[4795]: E0219 21:48:28.947462 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc\": container with ID starting with 2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc not found: ID does not exist" containerID="2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.947504 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc"} err="failed to get container status \"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc\": rpc error: code = NotFound desc = could not find container \"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc\": container with ID starting with 2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc not found: ID does not exist" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.024493 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v6ct\" (UniqueName: \"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") pod \"1207388e-c327-4eb4-b81f-dee124375ca8\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.024551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") pod \"1207388e-c327-4eb4-b81f-dee124375ca8\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.024769 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") pod \"1207388e-c327-4eb4-b81f-dee124375ca8\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.024796 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") pod \"1207388e-c327-4eb4-b81f-dee124375ca8\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.025521 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs" (OuterVolumeSpecName: "logs") pod "1207388e-c327-4eb4-b81f-dee124375ca8" (UID: "1207388e-c327-4eb4-b81f-dee124375ca8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.029508 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct" (OuterVolumeSpecName: "kube-api-access-8v6ct") pod "1207388e-c327-4eb4-b81f-dee124375ca8" (UID: "1207388e-c327-4eb4-b81f-dee124375ca8"). InnerVolumeSpecName "kube-api-access-8v6ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.053471 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data" (OuterVolumeSpecName: "config-data") pod "1207388e-c327-4eb4-b81f-dee124375ca8" (UID: "1207388e-c327-4eb4-b81f-dee124375ca8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.063222 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1207388e-c327-4eb4-b81f-dee124375ca8" (UID: "1207388e-c327-4eb4-b81f-dee124375ca8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.126774 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.126810 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.126821 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v6ct\" (UniqueName: \"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.126830 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.230661 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.249749 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.256865 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: E0219 21:48:29.257284 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.257303 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" Feb 19 21:48:29 crc kubenswrapper[4795]: E0219 
21:48:29.257326 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.257335 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.257519 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.257551 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.258591 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.266764 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.281197 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.432157 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.432232 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: 
I0219 21:48:29.432296 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.432367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.521771 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" path="/var/lib/kubelet/pods/1207388e-c327-4eb4-b81f-dee124375ca8/volumes" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.522351 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" path="/var/lib/kubelet/pods/d5aecdca-84a6-4987-8b84-a95fbb0096f9/volumes" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.522829 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.522978 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerName="kube-state-metrics" containerID="cri-o://0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824" gracePeriod=30 Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") pod \"nova-api-0\" (UID: 
\"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535318 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.540442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.542823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.551076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.551406 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.552261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.578746 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.911727 4795 generic.go:334] "Generic (PLEG): container finished" podID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerID="0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824" exitCode=2 Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.911893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c","Type":"ContainerDied","Data":"0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.000476 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.047027 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.145689 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") pod \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\" (UID: \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\") " Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.150800 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk" (OuterVolumeSpecName: "kube-api-access-qr8kk") pod "bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" (UID: "bb3b374e-f01b-4997-9ecf-fbeeb384cc2c"). InnerVolumeSpecName "kube-api-access-qr8kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.247952 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.930935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerStarted","Data":"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.931474 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerStarted","Data":"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.931500 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerStarted","Data":"4d23ebd426731022b2890eb99d72e3c34bf0eabb128346170d4b9e6c3457a311"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.933254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c","Type":"ContainerDied","Data":"1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.933306 4795 scope.go:117] "RemoveContainer" containerID="0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.933321 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.971499 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.971478244 podStartE2EDuration="1.971478244s" podCreationTimestamp="2026-02-19 21:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:30.958123439 +0000 UTC m=+1222.150641303" watchObservedRunningTime="2026-02-19 21:48:30.971478244 +0000 UTC m=+1222.163996128" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.983873 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.994129 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.003110 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: E0219 21:48:31.006097 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerName="kube-state-metrics" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.006191 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerName="kube-state-metrics" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.006477 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerName="kube-state-metrics" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.007182 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.009702 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.010451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.019812 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.165128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.165374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " 
pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.165471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.165755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.211876 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.212664 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-central-agent" containerID="cri-o://8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b" gracePeriod=30 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.213021 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="proxy-httpd" containerID="cri-o://f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b" gracePeriod=30 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.213260 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-notification-agent" 
containerID="cri-o://1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8" gracePeriod=30 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.213322 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="sg-core" containerID="cri-o://bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da" gracePeriod=30 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.267711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.267790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.267889 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.267960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " 
pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.273095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.279278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.279681 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.283395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.337914 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.535485 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" path="/var/lib/kubelet/pods/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c/volumes" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.800231 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: W0219 21:48:31.801625 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296f6b57_de45_495d_abe9_8c779c157057.slice/crio-b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3 WatchSource:0}: Error finding container b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3: Status 404 returned error can't find the container with id b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.946861 4795 generic.go:334] "Generic (PLEG): container finished" podID="60557b75-bafe-4e92-937c-e541b84aaf70" containerID="f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b" exitCode=0 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.947096 4795 generic.go:334] "Generic (PLEG): container finished" podID="60557b75-bafe-4e92-937c-e541b84aaf70" containerID="bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da" exitCode=2 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.947199 4795 generic.go:334] "Generic (PLEG): container finished" podID="60557b75-bafe-4e92-937c-e541b84aaf70" containerID="8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b" exitCode=0 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.946962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b"} Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.947421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da"} Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.947444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b"} Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.951578 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"296f6b57-de45-495d-abe9-8c779c157057","Type":"ContainerStarted","Data":"b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3"} Feb 19 21:48:32 crc kubenswrapper[4795]: I0219 21:48:32.961422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"296f6b57-de45-495d-abe9-8c779c157057","Type":"ContainerStarted","Data":"5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7"} Feb 19 21:48:32 crc kubenswrapper[4795]: I0219 21:48:32.961854 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 21:48:32 crc kubenswrapper[4795]: I0219 21:48:32.990229 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.537163662 podStartE2EDuration="2.99020322s" podCreationTimestamp="2026-02-19 21:48:30 +0000 UTC" firstStartedPulling="2026-02-19 21:48:31.803762107 +0000 UTC m=+1222.996279971" lastFinishedPulling="2026-02-19 21:48:32.256801665 +0000 UTC m=+1223.449319529" 
observedRunningTime="2026-02-19 21:48:32.977230756 +0000 UTC m=+1224.169748650" watchObservedRunningTime="2026-02-19 21:48:32.99020322 +0000 UTC m=+1224.182721124" Feb 19 21:48:33 crc kubenswrapper[4795]: I0219 21:48:33.265302 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:33 crc kubenswrapper[4795]: I0219 21:48:33.273106 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.396543 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.534985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535036 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535072 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") pod 
\"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.536181 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.537558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.537974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.541947 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6" (OuterVolumeSpecName: "kube-api-access-gzxr6") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "kube-api-access-gzxr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.542080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts" (OuterVolumeSpecName: "scripts") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.551666 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.551697 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.584320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.636054 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637867 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637897 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637907 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637916 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637925 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637933 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.651546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data" (OuterVolumeSpecName: "config-data") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.743386 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.992946 4795 generic.go:334] "Generic (PLEG): container finished" podID="60557b75-bafe-4e92-937c-e541b84aaf70" containerID="1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8" exitCode=0 Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.992984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8"} Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.993010 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"2512790f4175863e7da7d55dc8c6ebb57bbf253fa486687fa567e90d3c41dca6"} Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.993026 4795 scope.go:117] "RemoveContainer" containerID="f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.993146 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.054338 4795 scope.go:117] "RemoveContainer" containerID="bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.072064 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.085286 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.094791 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.095270 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-central-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095293 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-central-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.095311 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="sg-core" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095319 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="sg-core" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.095339 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="proxy-httpd" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095346 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="proxy-httpd" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.095366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-notification-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095373 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-notification-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095556 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-notification-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095574 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="sg-core" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095586 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="proxy-httpd" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095599 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-central-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.097406 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.099205 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.099551 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.099890 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.104808 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.127399 4795 scope.go:117] "RemoveContainer" containerID="1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.148422 4795 scope.go:117] "RemoveContainer" containerID="8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.171257 4795 scope.go:117] "RemoveContainer" containerID="f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.171775 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b\": container with ID starting with f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b not found: ID does not exist" containerID="f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.171892 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b"} err="failed to get container status 
\"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b\": rpc error: code = NotFound desc = could not find container \"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b\": container with ID starting with f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b not found: ID does not exist" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.171982 4795 scope.go:117] "RemoveContainer" containerID="bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.172403 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da\": container with ID starting with bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da not found: ID does not exist" containerID="bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.172424 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da"} err="failed to get container status \"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da\": rpc error: code = NotFound desc = could not find container \"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da\": container with ID starting with bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da not found: ID does not exist" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.172438 4795 scope.go:117] "RemoveContainer" containerID="1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.172668 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8\": container with ID starting with 1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8 not found: ID does not exist" containerID="1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.172748 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8"} err="failed to get container status \"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8\": rpc error: code = NotFound desc = could not find container \"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8\": container with ID starting with 1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8 not found: ID does not exist" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.172813 4795 scope.go:117] "RemoveContainer" containerID="8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.173086 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b\": container with ID starting with 8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b not found: ID does not exist" containerID="8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.173157 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b"} err="failed to get container status \"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b\": rpc error: code = NotFound desc = could not find container \"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b\": container with ID 
starting with 8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b not found: ID does not exist" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.253441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.253737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.253875 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.253953 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.254045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" 
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.254139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.254238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.254410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356040 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356219 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356243 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356410 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: 
I0219 21:48:35.356647 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.357217 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.362006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.362733 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.363195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.364701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.372204 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.377568 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.431723 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.521936 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" path="/var/lib/kubelet/pods/60557b75-bafe-4e92-937c-e541b84aaf70/volumes" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.565283 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.565908 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:48:35 crc kubenswrapper[4795]: W0219 
21:48:35.871227 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod621537d8_aeb6_42fa_842d_fb45f36c97f6.slice/crio-ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58 WatchSource:0}: Error finding container ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58: Status 404 returned error can't find the container with id ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58 Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.874041 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:36 crc kubenswrapper[4795]: I0219 21:48:36.006718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58"} Feb 19 21:48:37 crc kubenswrapper[4795]: I0219 21:48:37.019606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d"} Feb 19 21:48:38 crc kubenswrapper[4795]: I0219 21:48:38.032101 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3"} Feb 19 21:48:38 crc kubenswrapper[4795]: I0219 21:48:38.272936 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 21:48:38 crc kubenswrapper[4795]: I0219 21:48:38.303180 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 21:48:39 crc kubenswrapper[4795]: I0219 21:48:39.047880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b"} Feb 19 21:48:39 crc kubenswrapper[4795]: I0219 21:48:39.077031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 21:48:39 crc kubenswrapper[4795]: I0219 21:48:39.579513 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:48:39 crc kubenswrapper[4795]: I0219 21:48:39.579987 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:48:40 crc kubenswrapper[4795]: I0219 21:48:40.620422 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:48:40 crc kubenswrapper[4795]: I0219 21:48:40.620456 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:48:41 crc kubenswrapper[4795]: I0219 21:48:41.079121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a"} Feb 19 21:48:41 crc kubenswrapper[4795]: I0219 21:48:41.079423 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:48:41 crc kubenswrapper[4795]: I0219 21:48:41.126077 4795 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ceilometer-0" podStartSLOduration=2.095338536 podStartE2EDuration="6.126052119s" podCreationTimestamp="2026-02-19 21:48:35 +0000 UTC" firstStartedPulling="2026-02-19 21:48:35.875696669 +0000 UTC m=+1227.068214553" lastFinishedPulling="2026-02-19 21:48:39.906410262 +0000 UTC m=+1231.098928136" observedRunningTime="2026-02-19 21:48:41.108153936 +0000 UTC m=+1232.300671860" watchObservedRunningTime="2026-02-19 21:48:41.126052119 +0000 UTC m=+1232.318570003"
Feb 19 21:48:41 crc kubenswrapper[4795]: I0219 21:48:41.353465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 19 21:48:44 crc kubenswrapper[4795]: I0219 21:48:44.557202 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 21:48:44 crc kubenswrapper[4795]: I0219 21:48:44.559657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 21:48:44 crc kubenswrapper[4795]: I0219 21:48:44.562588 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 21:48:45 crc kubenswrapper[4795]: I0219 21:48:45.140514 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.058974 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.154928 4795 generic.go:334] "Generic (PLEG): container finished" podID="4dabff2c-427b-4307-b949-23fdde980292" containerID="0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45" exitCode=137
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.154968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dabff2c-427b-4307-b949-23fdde980292","Type":"ContainerDied","Data":"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"}
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.155000 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.155014 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dabff2c-427b-4307-b949-23fdde980292","Type":"ContainerDied","Data":"7a24e1fe5ec311d17f9a97363d6465ddfdef8ead02652c4242800fc85c6ff620"}
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.155034 4795 scope.go:117] "RemoveContainer" containerID="0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.180658 4795 scope.go:117] "RemoveContainer" containerID="0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"
Feb 19 21:48:47 crc kubenswrapper[4795]: E0219 21:48:47.181085 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45\": container with ID starting with 0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45 not found: ID does not exist" containerID="0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.181130 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"} err="failed to get container status \"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45\": rpc error: code = NotFound desc = could not find container \"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45\": container with ID starting with 0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45 not found: ID does not exist"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.186801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") pod \"4dabff2c-427b-4307-b949-23fdde980292\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") "
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.187015 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") pod \"4dabff2c-427b-4307-b949-23fdde980292\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") "
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.187173 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") pod \"4dabff2c-427b-4307-b949-23fdde980292\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") "
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.194927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd" (OuterVolumeSpecName: "kube-api-access-29qhd") pod "4dabff2c-427b-4307-b949-23fdde980292" (UID: "4dabff2c-427b-4307-b949-23fdde980292"). InnerVolumeSpecName "kube-api-access-29qhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.216431 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dabff2c-427b-4307-b949-23fdde980292" (UID: "4dabff2c-427b-4307-b949-23fdde980292"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.232068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data" (OuterVolumeSpecName: "config-data") pod "4dabff2c-427b-4307-b949-23fdde980292" (UID: "4dabff2c-427b-4307-b949-23fdde980292"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.289457 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.289504 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.289522 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.540557 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.540603 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.552271 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:47 crc kubenswrapper[4795]: E0219 21:48:47.552781 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dabff2c-427b-4307-b949-23fdde980292" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.552805 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dabff2c-427b-4307-b949-23fdde980292" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.553068 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dabff2c-427b-4307-b949-23fdde980292" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.553834 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.557684 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.564828 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.564996 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.579689 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.696770 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.697177 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.697228 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.697285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.697316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.803034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.803382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.803481 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.803438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.821229 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.904256 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:48 crc kubenswrapper[4795]: W0219 21:48:48.362346 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0adadcd9_8949_443b_8042_d0d11191eae9.slice/crio-6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e WatchSource:0}: Error finding container 6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e: Status 404 returned error can't find the container with id 6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e
Feb 19 21:48:48 crc kubenswrapper[4795]: I0219 21:48:48.369750 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.182778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0adadcd9-8949-443b-8042-d0d11191eae9","Type":"ContainerStarted","Data":"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f"}
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.183190 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0adadcd9-8949-443b-8042-d0d11191eae9","Type":"ContainerStarted","Data":"6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e"}
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.221427 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.221396907 podStartE2EDuration="2.221396907s" podCreationTimestamp="2026-02-19 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:49.202346532 +0000 UTC m=+1240.394864396" watchObservedRunningTime="2026-02-19 21:48:49.221396907 +0000 UTC m=+1240.413914821"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.532265 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dabff2c-427b-4307-b949-23fdde980292" path="/var/lib/kubelet/pods/4dabff2c-427b-4307-b949-23fdde980292/volumes"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.590583 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.592609 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.592747 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.607698 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.193341 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.199999 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.401946 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"]
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.403504 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.432843 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"]
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556596 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556713 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658091 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658149 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658417 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.660224 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.660827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.661392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.661771 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.662320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.682142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.731896 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:51 crc kubenswrapper[4795]: I0219 21:48:51.191565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"]
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.092761 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.093434 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-central-agent" containerID="cri-o://b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" gracePeriod=30
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.093530 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-notification-agent" containerID="cri-o://8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" gracePeriod=30
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.093533 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="sg-core" containerID="cri-o://a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" gracePeriod=30
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.093561 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="proxy-httpd" containerID="cri-o://be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" gracePeriod=30
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.099918 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.199:3000/\": read tcp 10.217.0.2:55466->10.217.0.199:3000: read: connection reset by peer"
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.210509 4795 generic.go:334] "Generic (PLEG): container finished" podID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerID="445b39298e77e683eee2d951a286aa4f9de79beb92f370dca4d81f0dfe56d255" exitCode=0
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.210585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerDied","Data":"445b39298e77e683eee2d951a286aa4f9de79beb92f370dca4d81f0dfe56d255"}
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.210638 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerStarted","Data":"cdc18778f810815fa368ef0cc45dbb0e103ffbfcfa9231a83aa5be2dd6cfe1c2"}
Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.905153 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.019764 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.221916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerStarted","Data":"0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678"}
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.222084 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224262 4795 generic.go:334] "Generic (PLEG): container finished" podID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerID="be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" exitCode=0
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224283 4795 generic.go:334] "Generic (PLEG): container finished" podID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerID="a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" exitCode=2
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224290 4795 generic.go:334] "Generic (PLEG): container finished" podID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerID="b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" exitCode=0
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224504 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" containerID="cri-o://cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" gracePeriod=30
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224600 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a"}
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224622 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b"}
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d"}
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224696 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" containerID="cri-o://a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" gracePeriod=30
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.242252 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7677694455-vk4hv" podStartSLOduration=3.242234383 podStartE2EDuration="3.242234383s" podCreationTimestamp="2026-02-19 21:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:53.240994939 +0000 UTC m=+1244.433512803" watchObservedRunningTime="2026-02-19 21:48:53.242234383 +0000 UTC m=+1244.434752257"
Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.981153 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") "
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") "
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132366 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") "
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132419 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") "
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132456 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") "
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") "
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") "
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") "
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132917 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.133019 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.133054 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.138438 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts" (OuterVolumeSpecName: "scripts") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.151379 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf" (OuterVolumeSpecName: "kube-api-access-wd9mf") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "kube-api-access-wd9mf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.169541 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.198457 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.224655 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data" (OuterVolumeSpecName: "config-data") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.233914 4795 generic.go:334] "Generic (PLEG): container finished" podID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerID="cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" exitCode=143 Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.233993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerDied","Data":"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94"} Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234257 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234280 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234290 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234299 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234307 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 
21:48:54.234316 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236240 4795 generic.go:334] "Generic (PLEG): container finished" podID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerID="8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" exitCode=0 Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3"} Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58"} Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236311 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236314 4795 scope.go:117] "RemoveContainer" containerID="be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.253330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.292286 4795 scope.go:117] "RemoveContainer" containerID="a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.310511 4795 scope.go:117] "RemoveContainer" containerID="8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.329653 4795 scope.go:117] "RemoveContainer" containerID="b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.340337 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.350944 4795 scope.go:117] "RemoveContainer" containerID="be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.351518 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a\": container with ID starting with be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a not found: ID does not exist" containerID="be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.351553 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a"} err="failed to get container status \"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a\": rpc error: code = NotFound desc = could not find container \"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a\": container with ID starting 
with be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a not found: ID does not exist" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.351584 4795 scope.go:117] "RemoveContainer" containerID="a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.351985 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b\": container with ID starting with a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b not found: ID does not exist" containerID="a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.352019 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b"} err="failed to get container status \"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b\": rpc error: code = NotFound desc = could not find container \"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b\": container with ID starting with a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b not found: ID does not exist" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.352038 4795 scope.go:117] "RemoveContainer" containerID="8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.352362 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3\": container with ID starting with 8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3 not found: ID does not exist" containerID="8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" Feb 19 21:48:54 
crc kubenswrapper[4795]: I0219 21:48:54.352400 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3"} err="failed to get container status \"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3\": rpc error: code = NotFound desc = could not find container \"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3\": container with ID starting with 8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3 not found: ID does not exist" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.352426 4795 scope.go:117] "RemoveContainer" containerID="b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.352690 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d\": container with ID starting with b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d not found: ID does not exist" containerID="b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.352712 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d"} err="failed to get container status \"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d\": rpc error: code = NotFound desc = could not find container \"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d\": container with ID starting with b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d not found: ID does not exist" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.569093 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:54 crc 
kubenswrapper[4795]: I0219 21:48:54.581957 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.601455 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.601920 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-notification-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.601946 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-notification-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.601962 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="proxy-httpd" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.601970 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="proxy-httpd" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.601985 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="sg-core" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.601993 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="sg-core" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.602023 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-central-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602032 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-central-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602267 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="sg-core" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-notification-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602314 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="proxy-httpd" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602328 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-central-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.604380 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.606398 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.609131 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.609467 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.614461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.745916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.745961 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.745989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746095 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746628 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-246kt\" (UniqueName: 
\"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746691 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-246kt\" (UniqueName: \"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc 
kubenswrapper[4795]: I0219 21:48:54.848540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848582 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.849352 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.849367 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") pod \"ceilometer-0\" 
(UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.857727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.857842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.858071 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.858177 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.864821 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.866902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-246kt\" (UniqueName: 
\"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.919123 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:55 crc kubenswrapper[4795]: I0219 21:48:55.347875 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:55 crc kubenswrapper[4795]: W0219 21:48:55.353122 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d2bcc0_aacc_413f_bc5e_36f3aa7a4ed5.slice/crio-86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8 WatchSource:0}: Error finding container 86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8: Status 404 returned error can't find the container with id 86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8 Feb 19 21:48:55 crc kubenswrapper[4795]: I0219 21:48:55.369129 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:55 crc kubenswrapper[4795]: I0219 21:48:55.521500 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" path="/var/lib/kubelet/pods/621537d8-aeb6-42fa-842d-fb45f36c97f6/volumes" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.269788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66"} Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.270126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8"} Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.746135 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.894726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") pod \"01859e80-9d51-4db2-8a48-9ad45d901f16\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.894787 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") pod \"01859e80-9d51-4db2-8a48-9ad45d901f16\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.894972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") pod \"01859e80-9d51-4db2-8a48-9ad45d901f16\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.895030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") pod \"01859e80-9d51-4db2-8a48-9ad45d901f16\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.895558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs" (OuterVolumeSpecName: "logs") pod 
"01859e80-9d51-4db2-8a48-9ad45d901f16" (UID: "01859e80-9d51-4db2-8a48-9ad45d901f16"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.907408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq" (OuterVolumeSpecName: "kube-api-access-s9krq") pod "01859e80-9d51-4db2-8a48-9ad45d901f16" (UID: "01859e80-9d51-4db2-8a48-9ad45d901f16"). InnerVolumeSpecName "kube-api-access-s9krq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.930321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data" (OuterVolumeSpecName: "config-data") pod "01859e80-9d51-4db2-8a48-9ad45d901f16" (UID: "01859e80-9d51-4db2-8a48-9ad45d901f16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.942367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01859e80-9d51-4db2-8a48-9ad45d901f16" (UID: "01859e80-9d51-4db2-8a48-9ad45d901f16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.997586 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.997625 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.997635 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.997644 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.281219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52"} Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.283242 4795 generic.go:334] "Generic (PLEG): container finished" podID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerID="a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" exitCode=0 Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.283397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerDied","Data":"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59"} Feb 19 21:48:57 crc 
kubenswrapper[4795]: I0219 21:48:57.283402 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.283514 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerDied","Data":"4d23ebd426731022b2890eb99d72e3c34bf0eabb128346170d4b9e6c3457a311"} Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.283605 4795 scope.go:117] "RemoveContainer" containerID="a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.303205 4795 scope.go:117] "RemoveContainer" containerID="cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.320956 4795 scope.go:117] "RemoveContainer" containerID="a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" Feb 19 21:48:57 crc kubenswrapper[4795]: E0219 21:48:57.321433 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59\": container with ID starting with a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59 not found: ID does not exist" containerID="a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.321465 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59"} err="failed to get container status \"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59\": rpc error: code = NotFound desc = could not find container \"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59\": container with ID starting with 
a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59 not found: ID does not exist" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.321483 4795 scope.go:117] "RemoveContainer" containerID="cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" Feb 19 21:48:57 crc kubenswrapper[4795]: E0219 21:48:57.321991 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94\": container with ID starting with cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94 not found: ID does not exist" containerID="cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.322012 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94"} err="failed to get container status \"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94\": rpc error: code = NotFound desc = could not find container \"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94\": container with ID starting with cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94 not found: ID does not exist" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.325106 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.335424 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.345468 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:57 crc kubenswrapper[4795]: E0219 21:48:57.345822 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" Feb 19 21:48:57 crc 
kubenswrapper[4795]: I0219 21:48:57.345840 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" Feb 19 21:48:57 crc kubenswrapper[4795]: E0219 21:48:57.345866 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.345873 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.346131 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.346155 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.349929 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.352199 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.352353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.353937 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.356917 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.504782 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505517 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.537308 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" path="/var/lib/kubelet/pods/01859e80-9d51-4db2-8a48-9ad45d901f16/volumes" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607106 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.610337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.610931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc 
kubenswrapper[4795]: I0219 21:48:57.611267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.613285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.615709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.629117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.670061 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.904441 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.921422 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.132097 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:58 crc kubenswrapper[4795]: W0219 21:48:58.132788 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ecdad61_afa9_43fa_9321_1b58d9abf074.slice/crio-ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c WatchSource:0}: Error finding container ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c: Status 404 returned error can't find the container with id ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.315075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerStarted","Data":"ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c"} Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.324719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444"} Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.496196 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.729591 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.748572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.749907 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.755754 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.756032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.841464 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.841591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.841626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc 
kubenswrapper[4795]: I0219 21:48:58.841771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.943896 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.943992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.944027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.944065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 
21:48:58.949092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.951448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.952763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.963899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.073704 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.347931 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerStarted","Data":"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c"} Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.348278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerStarted","Data":"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc"} Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.369632 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.369617664 podStartE2EDuration="2.369617664s" podCreationTimestamp="2026-02-19 21:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:59.369305275 +0000 UTC m=+1250.561823149" watchObservedRunningTime="2026-02-19 21:48:59.369617664 +0000 UTC m=+1250.562135518" Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.505638 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:48:59 crc kubenswrapper[4795]: W0219 21:48:59.509542 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1d3a710_addc_4f86_b77c_0d05dc98695f.slice/crio-51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6 WatchSource:0}: Error finding container 51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6: Status 404 returned error can't find the container with id 51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.356574 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w8x5b" event={"ID":"d1d3a710-addc-4f86-b77c-0d05dc98695f","Type":"ContainerStarted","Data":"2f19263c7946bfa6317a4079eb988e89329f3723a790f4116428242859b8575d"} Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.357827 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w8x5b" event={"ID":"d1d3a710-addc-4f86-b77c-0d05dc98695f","Type":"ContainerStarted","Data":"51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6"} Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361594 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-central-agent" containerID="cri-o://323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" gracePeriod=30 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361855 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="proxy-httpd" containerID="cri-o://8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" gracePeriod=30 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361843 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a"} Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.362487 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361930 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-notification-agent" 
containerID="cri-o://d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" gracePeriod=30 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361883 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="sg-core" containerID="cri-o://7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" gracePeriod=30 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.389229 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-w8x5b" podStartSLOduration=2.389204869 podStartE2EDuration="2.389204869s" podCreationTimestamp="2026-02-19 21:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:00.375136953 +0000 UTC m=+1251.567654817" watchObservedRunningTime="2026-02-19 21:49:00.389204869 +0000 UTC m=+1251.581722743" Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.412333 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.470677727 podStartE2EDuration="6.412309648s" podCreationTimestamp="2026-02-19 21:48:54 +0000 UTC" firstStartedPulling="2026-02-19 21:48:55.35525774 +0000 UTC m=+1246.547775604" lastFinishedPulling="2026-02-19 21:48:59.296889671 +0000 UTC m=+1250.489407525" observedRunningTime="2026-02-19 21:49:00.399327443 +0000 UTC m=+1251.591845307" watchObservedRunningTime="2026-02-19 21:49:00.412309648 +0000 UTC m=+1251.604827532" Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.733210 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.788933 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:49:00 crc 
kubenswrapper[4795]: I0219 21:49:00.789215 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="dnsmasq-dns" containerID="cri-o://fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" gracePeriod=10 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.310567 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371391 4795 generic.go:334] "Generic (PLEG): container finished" podID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerID="fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" exitCode=0 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371729 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerDied","Data":"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerDied","Data":"919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371773 4795 scope.go:117] "RemoveContainer" containerID="fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371890 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397219 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397297 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397397 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.399374 4795 generic.go:334] "Generic (PLEG): container finished" podID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerID="8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" exitCode=0 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.399430 4795 generic.go:334] "Generic (PLEG): container finished" podID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerID="7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" exitCode=2 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.399439 4795 generic.go:334] "Generic (PLEG): container finished" podID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerID="d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" exitCode=0 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.400181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.400233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.400245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.407337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm" (OuterVolumeSpecName: "kube-api-access-5zrmm") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "kube-api-access-5zrmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.449322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.451710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.454917 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.461944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.474251 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config" (OuterVolumeSpecName: "config") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503331 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503363 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503375 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503384 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503393 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503403 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.595946 4795 scope.go:117] "RemoveContainer" containerID="c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.619423 4795 scope.go:117] "RemoveContainer" containerID="fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" Feb 19 21:49:01 crc kubenswrapper[4795]: E0219 21:49:01.625896 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f\": container with ID starting with fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f not found: ID does not exist" containerID="fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.625926 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f"} err="failed to get container status \"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f\": rpc error: code = NotFound desc = could not find container \"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f\": container with ID starting with fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f not found: ID does not exist" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.625949 4795 scope.go:117] "RemoveContainer" containerID="c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895" Feb 19 21:49:01 crc kubenswrapper[4795]: E0219 21:49:01.626443 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895\": container with ID starting 
with c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895 not found: ID does not exist" containerID="c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.626524 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895"} err="failed to get container status \"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895\": rpc error: code = NotFound desc = could not find container \"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895\": container with ID starting with c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895 not found: ID does not exist" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.683201 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.702285 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.719063 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-246kt\" (UniqueName: \"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 
21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821294 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821458 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.822025 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.822084 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.827472 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts" (OuterVolumeSpecName: "scripts") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.827613 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt" (OuterVolumeSpecName: "kube-api-access-246kt") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "kube-api-access-246kt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.864916 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.892049 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924034 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924083 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-246kt\" (UniqueName: \"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924102 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924119 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924134 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924149 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.938148 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.962681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data" (OuterVolumeSpecName: "config-data") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.025370 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.025403 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.410497 4795 generic.go:334] "Generic (PLEG): container finished" podID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerID="323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" exitCode=0 Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.410556 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.410571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66"} Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.411703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8"} Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.411722 4795 scope.go:117] "RemoveContainer" containerID="8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.440332 4795 scope.go:117] "RemoveContainer" containerID="7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" Feb 19 21:49:02 crc 
kubenswrapper[4795]: I0219 21:49:02.448271 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.458540 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.471598 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.471980 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="proxy-httpd" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.471997 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="proxy-httpd" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.472018 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-central-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472027 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-central-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.472041 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="sg-core" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472046 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="sg-core" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.472057 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="init" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472064 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="init" Feb 19 21:49:02 
crc kubenswrapper[4795]: E0219 21:49:02.472079 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="dnsmasq-dns" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472085 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="dnsmasq-dns" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.472103 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-notification-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472109 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-notification-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472278 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="dnsmasq-dns" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472287 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="proxy-httpd" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="sg-core" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472310 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-notification-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472328 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-central-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.473945 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.475858 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.477352 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.477535 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.478997 4795 scope.go:117] "RemoveContainer" containerID="d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.500461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.523994 4795 scope.go:117] "RemoveContainer" containerID="323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541159 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541566 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") pod \"ceilometer-0\" (UID: 
\"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.578396 4795 scope.go:117] "RemoveContainer" containerID="8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.579580 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a\": container with ID starting with 8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a not found: ID does not exist" containerID="8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.579628 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a"} err="failed to get container status \"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a\": rpc error: code = NotFound desc = could not find container \"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a\": container with ID starting with 8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a not found: ID does not exist" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.579650 4795 scope.go:117] "RemoveContainer" containerID="7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.579936 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444\": container with ID starting with 7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444 not found: ID does not exist" containerID="7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" Feb 19 21:49:02 crc kubenswrapper[4795]: 
I0219 21:49:02.580015 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444"} err="failed to get container status \"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444\": rpc error: code = NotFound desc = could not find container \"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444\": container with ID starting with 7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444 not found: ID does not exist" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.580091 4795 scope.go:117] "RemoveContainer" containerID="d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.580327 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52\": container with ID starting with d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52 not found: ID does not exist" containerID="d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.580351 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52"} err="failed to get container status \"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52\": rpc error: code = NotFound desc = could not find container \"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52\": container with ID starting with d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52 not found: ID does not exist" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.580365 4795 scope.go:117] "RemoveContainer" containerID="323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" Feb 19 21:49:02 crc 
kubenswrapper[4795]: E0219 21:49:02.581339 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66\": container with ID starting with 323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66 not found: ID does not exist" containerID="323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.581365 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66"} err="failed to get container status \"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66\": rpc error: code = NotFound desc = could not find container \"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66\": container with ID starting with 323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66 not found: ID does not exist" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.643942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.643993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") pod 
\"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644239 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644314 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.646035 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.647539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.649536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.650414 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.652520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.654798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 
21:49:02.660177 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.662305 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.797798 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:03 crc kubenswrapper[4795]: I0219 21:49:03.222695 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:03 crc kubenswrapper[4795]: I0219 21:49:03.425493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"9d09fb4d826d8602127fabff658a8440e51f38b0c8a942f510e29c6808527ef7"} Feb 19 21:49:03 crc kubenswrapper[4795]: I0219 21:49:03.534925 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" path="/var/lib/kubelet/pods/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5/volumes" Feb 19 21:49:03 crc kubenswrapper[4795]: I0219 21:49:03.536271 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" path="/var/lib/kubelet/pods/59d807ee-6555-4c2f-8598-9f264d5a95f9/volumes" Feb 19 21:49:04 crc kubenswrapper[4795]: I0219 21:49:04.436537 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1d3a710-addc-4f86-b77c-0d05dc98695f" 
containerID="2f19263c7946bfa6317a4079eb988e89329f3723a790f4116428242859b8575d" exitCode=0 Feb 19 21:49:04 crc kubenswrapper[4795]: I0219 21:49:04.436662 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w8x5b" event={"ID":"d1d3a710-addc-4f86-b77c-0d05dc98695f","Type":"ContainerDied","Data":"2f19263c7946bfa6317a4079eb988e89329f3723a790f4116428242859b8575d"} Feb 19 21:49:04 crc kubenswrapper[4795]: I0219 21:49:04.441396 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552"} Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.458065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734"} Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.813124 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.907999 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") pod \"d1d3a710-addc-4f86-b77c-0d05dc98695f\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.908195 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") pod \"d1d3a710-addc-4f86-b77c-0d05dc98695f\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.908261 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") pod \"d1d3a710-addc-4f86-b77c-0d05dc98695f\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.908316 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") pod \"d1d3a710-addc-4f86-b77c-0d05dc98695f\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.915232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4" (OuterVolumeSpecName: "kube-api-access-spfz4") pod "d1d3a710-addc-4f86-b77c-0d05dc98695f" (UID: "d1d3a710-addc-4f86-b77c-0d05dc98695f"). InnerVolumeSpecName "kube-api-access-spfz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.921298 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts" (OuterVolumeSpecName: "scripts") pod "d1d3a710-addc-4f86-b77c-0d05dc98695f" (UID: "d1d3a710-addc-4f86-b77c-0d05dc98695f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.938019 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data" (OuterVolumeSpecName: "config-data") pod "d1d3a710-addc-4f86-b77c-0d05dc98695f" (UID: "d1d3a710-addc-4f86-b77c-0d05dc98695f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.940416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1d3a710-addc-4f86-b77c-0d05dc98695f" (UID: "d1d3a710-addc-4f86-b77c-0d05dc98695f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.010401 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.010444 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.010460 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.010472 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.473111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb"} Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.476254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w8x5b" event={"ID":"d1d3a710-addc-4f86-b77c-0d05dc98695f","Type":"ContainerDied","Data":"51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6"} Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.476286 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 
21:49:06.476420 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.634459 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.634726 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-log" containerID="cri-o://7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" gracePeriod=30 Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.634864 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-api" containerID="cri-o://5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" gracePeriod=30 Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.648042 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.648285 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" containerName="nova-scheduler-scheduler" containerID="cri-o://a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" gracePeriod=30 Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.695668 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.696059 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log" containerID="cri-o://fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145" gracePeriod=30 Feb 19 
21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.696241 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata" containerID="cri-o://a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294" gracePeriod=30 Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.152664 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.272717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.272846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.272897 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.272963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 
21:49:07.273038 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.273063 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.273963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs" (OuterVolumeSpecName: "logs") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.276891 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8" (OuterVolumeSpecName: "kube-api-access-qhjf8") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "kube-api-access-qhjf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.306931 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data" (OuterVolumeSpecName: "config-data") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.313236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.327476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.339859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374899 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374933 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374946 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374956 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374964 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374972 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.486549 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" exitCode=0 Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.486593 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" exitCode=143 Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.486612 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.486639 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerDied","Data":"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.487056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerDied","Data":"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.487083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerDied","Data":"ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.487104 4795 scope.go:117] "RemoveContainer" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.494100 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.496448 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.500179 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerID="fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145" exitCode=143 Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.500230 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerDied","Data":"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.531750 4795 scope.go:117] "RemoveContainer" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.543803 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7829972299999999 podStartE2EDuration="5.543784058s" podCreationTimestamp="2026-02-19 21:49:02 +0000 UTC" firstStartedPulling="2026-02-19 21:49:03.226061122 +0000 UTC m=+1254.418579036" lastFinishedPulling="2026-02-19 21:49:06.986848 +0000 UTC m=+1258.179365864" observedRunningTime="2026-02-19 21:49:07.519757163 +0000 UTC m=+1258.712275027" watchObservedRunningTime="2026-02-19 21:49:07.543784058 +0000 UTC m=+1258.736301922" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.547149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.555529 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.562394 4795 scope.go:117] "RemoveContainer" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.562836 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": container with ID starting with 
5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c not found: ID does not exist" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.562879 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c"} err="failed to get container status \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": rpc error: code = NotFound desc = could not find container \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": container with ID starting with 5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c not found: ID does not exist" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.562910 4795 scope.go:117] "RemoveContainer" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.563273 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": container with ID starting with 7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc not found: ID does not exist" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563316 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc"} err="failed to get container status \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": rpc error: code = NotFound desc = could not find container \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": container with ID starting with 7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc not found: ID does not 
exist" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563342 4795 scope.go:117] "RemoveContainer" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563612 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c"} err="failed to get container status \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": rpc error: code = NotFound desc = could not find container \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": container with ID starting with 5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c not found: ID does not exist" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563640 4795 scope.go:117] "RemoveContainer" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563926 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc"} err="failed to get container status \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": rpc error: code = NotFound desc = could not find container \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": container with ID starting with 7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc not found: ID does not exist" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.571878 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.572364 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-api" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572383 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-api" Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.572398 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d3a710-addc-4f86-b77c-0d05dc98695f" containerName="nova-manage" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572404 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d3a710-addc-4f86-b77c-0d05dc98695f" containerName="nova-manage" Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.572417 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-log" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-log" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572593 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-api" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572615 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1d3a710-addc-4f86-b77c-0d05dc98695f" containerName="nova-manage" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572626 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-log" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.573597 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.575968 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.577547 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.578520 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.600854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.684823 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.684900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.684936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.684969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.685001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.685061 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc 
kubenswrapper[4795]: I0219 21:49:07.786665 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.787124 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.791560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.791816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") pod \"nova-api-0\" 
(UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.794142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.794957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.805849 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.904237 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:49:08 crc kubenswrapper[4795]: E0219 21:49:08.274442 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:49:08 crc kubenswrapper[4795]: E0219 21:49:08.276394 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:49:08 crc kubenswrapper[4795]: E0219 21:49:08.278956 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:49:08 crc kubenswrapper[4795]: E0219 21:49:08.279015 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" containerName="nova-scheduler-scheduler" Feb 19 21:49:08 crc kubenswrapper[4795]: I0219 21:49:08.340285 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:08 crc kubenswrapper[4795]: I0219 21:49:08.515983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerStarted","Data":"14cc3bd11f9f9a1b0b976a11e87f576616488b4c4d4dfa8a49e1d97fcc43ddfd"} Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.526453 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" path="/var/lib/kubelet/pods/7ecdad61-afa9-43fa-9321-1b58d9abf074/volumes" Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.536951 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerStarted","Data":"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70"} Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.536991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerStarted","Data":"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681"} Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.567011 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.566988611 podStartE2EDuration="2.566988611s" podCreationTimestamp="2026-02-19 21:49:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:09.562396212 +0000 UTC m=+1260.754914156" watchObservedRunningTime="2026-02-19 21:49:09.566988611 +0000 UTC m=+1260.759506495" Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.834780 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:60428->10.217.0.195:8775: read: connection reset by peer" Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.835224 4795 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:60426->10.217.0.195:8775: read: connection reset by peer" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.295767 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.438711 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.438770 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.438863 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.439010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.439096 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.440210 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs" (OuterVolumeSpecName: "logs") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.450487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt" (OuterVolumeSpecName: "kube-api-access-fd6zt") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "kube-api-access-fd6zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.477551 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.479622 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data" (OuterVolumeSpecName: "config-data") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.505337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541235 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541266 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541279 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541293 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541304 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.548881 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerID="a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294" exitCode=0 Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.548947 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.548953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerDied","Data":"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294"} Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.549043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerDied","Data":"d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95"} Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.549063 4795 scope.go:117] "RemoveContainer" containerID="a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.599355 4795 scope.go:117] "RemoveContainer" containerID="fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.637741 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.652379 4795 scope.go:117] "RemoveContainer" containerID="a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294" Feb 19 21:49:10 crc kubenswrapper[4795]: E0219 21:49:10.654906 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294\": container with ID starting with a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294 not found: ID does not exist" 
containerID="a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.654946 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294"} err="failed to get container status \"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294\": rpc error: code = NotFound desc = could not find container \"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294\": container with ID starting with a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294 not found: ID does not exist" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.654973 4795 scope.go:117] "RemoveContainer" containerID="fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.655055 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:49:10 crc kubenswrapper[4795]: E0219 21:49:10.655430 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145\": container with ID starting with fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145 not found: ID does not exist" containerID="fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.655450 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145"} err="failed to get container status \"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145\": rpc error: code = NotFound desc = could not find container \"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145\": container with ID starting with 
fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145 not found: ID does not exist" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.663756 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:49:10 crc kubenswrapper[4795]: E0219 21:49:10.664194 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.664206 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata" Feb 19 21:49:10 crc kubenswrapper[4795]: E0219 21:49:10.664223 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.664229 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.664575 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.664674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.665592 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.668930 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.669286 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.674647 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxzt2\" (UniqueName: \"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.847962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.848019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxzt2\" (UniqueName: \"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.848080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.848157 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " 
pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.848244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.849453 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.853080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.855833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.856086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.873730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxzt2\" (UniqueName: 
\"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.985830 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:49:11 crc kubenswrapper[4795]: I0219 21:49:11.463317 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:49:11 crc kubenswrapper[4795]: W0219 21:49:11.471718 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e160dc_ca4c_45d8_ab73_5ddd1a7d2107.slice/crio-d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52 WatchSource:0}: Error finding container d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52: Status 404 returned error can't find the container with id d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52 Feb 19 21:49:11 crc kubenswrapper[4795]: I0219 21:49:11.524014 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" path="/var/lib/kubelet/pods/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8/volumes" Feb 19 21:49:11 crc kubenswrapper[4795]: I0219 21:49:11.573715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerStarted","Data":"d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52"} Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.119869 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.172788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") pod \"c0f49585-0601-424b-9f28-304ae06c9d93\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.172934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") pod \"c0f49585-0601-424b-9f28-304ae06c9d93\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.172981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") pod \"c0f49585-0601-424b-9f28-304ae06c9d93\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.179362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5" (OuterVolumeSpecName: "kube-api-access-8tmq5") pod "c0f49585-0601-424b-9f28-304ae06c9d93" (UID: "c0f49585-0601-424b-9f28-304ae06c9d93"). InnerVolumeSpecName "kube-api-access-8tmq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.206390 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data" (OuterVolumeSpecName: "config-data") pod "c0f49585-0601-424b-9f28-304ae06c9d93" (UID: "c0f49585-0601-424b-9f28-304ae06c9d93"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.212097 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0f49585-0601-424b-9f28-304ae06c9d93" (UID: "c0f49585-0601-424b-9f28-304ae06c9d93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.274936 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.274979 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.274993 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.582725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerStarted","Data":"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5"} Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.582773 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerStarted","Data":"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37"} Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584715 4795 
generic.go:334] "Generic (PLEG): container finished" podID="c0f49585-0601-424b-9f28-304ae06c9d93" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" exitCode=0 Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584779 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584775 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0f49585-0601-424b-9f28-304ae06c9d93","Type":"ContainerDied","Data":"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5"} Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0f49585-0601-424b-9f28-304ae06c9d93","Type":"ContainerDied","Data":"cd865b730fa4ce805f52ae67f6b00c0275c97a6018c4d5724e64d28e7cd4b5db"} Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584906 4795 scope.go:117] "RemoveContainer" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.608627 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.608606975 podStartE2EDuration="2.608606975s" podCreationTimestamp="2026-02-19 21:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:12.606744813 +0000 UTC m=+1263.799262697" watchObservedRunningTime="2026-02-19 21:49:12.608606975 +0000 UTC m=+1263.801124879" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.634086 4795 scope.go:117] "RemoveContainer" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" Feb 19 21:49:12 crc kubenswrapper[4795]: E0219 21:49:12.637012 4795 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5\": container with ID starting with a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5 not found: ID does not exist" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.637060 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5"} err="failed to get container status \"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5\": rpc error: code = NotFound desc = could not find container \"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5\": container with ID starting with a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5 not found: ID does not exist" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.652488 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.664745 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.692269 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:49:12 crc kubenswrapper[4795]: E0219 21:49:12.692719 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" containerName="nova-scheduler-scheduler" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.692738 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" containerName="nova-scheduler-scheduler" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.692958 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" 
containerName="nova-scheduler-scheduler" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.693729 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.698788 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.705084 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.783587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.783647 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.783808 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.885515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.885812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.885929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.890092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.890559 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0" Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.918242 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0" Feb 19 21:49:13 crc kubenswrapper[4795]: I0219 21:49:13.010604 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:49:13 crc kubenswrapper[4795]: I0219 21:49:13.473289 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:49:13 crc kubenswrapper[4795]: I0219 21:49:13.524690 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" path="/var/lib/kubelet/pods/c0f49585-0601-424b-9f28-304ae06c9d93/volumes" Feb 19 21:49:13 crc kubenswrapper[4795]: I0219 21:49:13.594717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2710b23-7a5c-44cb-b916-9e08edc59636","Type":"ContainerStarted","Data":"18a795e7f80bb780eadeb9ae01b9659d15da8639c51f358e1baf726a07014084"} Feb 19 21:49:14 crc kubenswrapper[4795]: I0219 21:49:14.604650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2710b23-7a5c-44cb-b916-9e08edc59636","Type":"ContainerStarted","Data":"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6"} Feb 19 21:49:14 crc kubenswrapper[4795]: I0219 21:49:14.626684 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6266681419999998 podStartE2EDuration="2.626668142s" podCreationTimestamp="2026-02-19 21:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:14.617567007 +0000 UTC m=+1265.810084871" watchObservedRunningTime="2026-02-19 21:49:14.626668142 +0000 UTC m=+1265.819186006" Feb 19 21:49:15 crc kubenswrapper[4795]: I0219 21:49:15.986679 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:49:15 crc kubenswrapper[4795]: I0219 21:49:15.987101 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:49:18 crc 
kubenswrapper[4795]: I0219 21:49:18.225655 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 21:49:18 crc kubenswrapper[4795]: I0219 21:49:18.228055 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:49:18 crc kubenswrapper[4795]: I0219 21:49:18.229202 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:49:19 crc kubenswrapper[4795]: I0219 21:49:19.311399 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:49:19 crc kubenswrapper[4795]: I0219 21:49:19.311523 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:49:20 crc kubenswrapper[4795]: I0219 21:49:20.986436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:49:20 crc kubenswrapper[4795]: I0219 21:49:20.986956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:49:22 crc kubenswrapper[4795]: I0219 21:49:22.005275 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:49:22 crc kubenswrapper[4795]: I0219 21:49:22.005510 4795 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 21:49:23 crc kubenswrapper[4795]: I0219 21:49:23.011085 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 21:49:23 crc kubenswrapper[4795]: I0219 21:49:23.052466 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 21:49:23 crc kubenswrapper[4795]: I0219 21:49:23.359036 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 21:49:27 crc kubenswrapper[4795]: I0219 21:49:27.918963 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 21:49:27 crc kubenswrapper[4795]: I0219 21:49:27.920845 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 21:49:27 crc kubenswrapper[4795]: I0219 21:49:27.921049 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 21:49:27 crc kubenswrapper[4795]: I0219 21:49:27.932128 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 21:49:28 crc kubenswrapper[4795]: I0219 21:49:28.412225 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 21:49:28 crc kubenswrapper[4795]: I0219 21:49:28.420399 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 21:49:28 crc kubenswrapper[4795]: I0219 21:49:28.427875 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:49:28 crc kubenswrapper[4795]: I0219 21:49:28.427937 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:49:30 crc kubenswrapper[4795]: I0219 21:49:30.992968 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 21:49:30 crc kubenswrapper[4795]: I0219 21:49:30.993768 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 21:49:30 crc kubenswrapper[4795]: I0219 21:49:30.998915 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 21:49:31 crc kubenswrapper[4795]: I0219 21:49:31.000865 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 21:49:32 crc kubenswrapper[4795]: I0219 21:49:32.807093 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.729901 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.730538 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="336beec4-e534-448f-8367-78645b53650e" containerName="openstackclient" containerID="cri-o://6f5fd7fb0db869022abdd3586a7debaf90502df56c7437b86541c8afbbd3687a" gracePeriod=2 Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.750106 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/openstackclient"] Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.794736 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:49:53 crc kubenswrapper[4795]: E0219 21:49:53.798670 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336beec4-e534-448f-8367-78645b53650e" containerName="openstackclient" Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.798777 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="336beec4-e534-448f-8367-78645b53650e" containerName="openstackclient" Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.799062 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="336beec4-e534-448f-8367-78645b53650e" containerName="openstackclient" Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.799689 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.809624 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.828511 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.900087 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.901313 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.905788 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.943389 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jr6xc"] Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.962229 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jr6xc"] Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.970426 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.980098 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.980210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.003347 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.003878 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"] Feb 19 21:49:54 crc 
kubenswrapper[4795]: I0219 21:49:54.085125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65w5\" (UniqueName: \"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.085214 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.085245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.085294 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.086075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") pod \"root-account-create-update-ktl2b\" (UID: 
\"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.186501 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.186650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65w5\" (UniqueName: \"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.187229 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.191679 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.199931 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.201098 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.210249 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.237119 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65w5\" (UniqueName: \"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.291952 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.314288 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.314527 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd" containerID="cri-o://2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" gracePeriod=30 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.314571 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="openstack-network-exporter" containerID="cri-o://1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" gracePeriod=30 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.347841 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.398348 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.398442 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.405228 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.406633 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.418104 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.439353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.440621 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.452701 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.453213 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.481490 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.499857 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.502034 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.502120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.503872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.503891 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: 
configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.503946 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:49:55.003929475 +0000 UTC m=+1306.196447339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.528546 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-b4bcd"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.529547 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.535904 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-b4bcd"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.544754 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.575188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.593479 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"] Feb 19 
21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.601036 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.604305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.605736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.605944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.606156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 
21:49:54.660332 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.678553 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.715285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.715389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.715554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.715622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 
crc kubenswrapper[4795]: I0219 21:49:54.716283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.766295 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.767843 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.768718 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="openstack-network-exporter" containerID="cri-o://9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b" gracePeriod=300 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.851325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.877084 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") pod 
\"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.905518 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.905663 4795 generic.go:334] "Generic (PLEG): container finished" podID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerID="1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" exitCode=2 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.905690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerDied","Data":"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3"} Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.944203 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.944243 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.972336 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.978463 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.978527 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.995338 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="ovsdbserver-nb" containerID="cri-o://146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033" gracePeriod=300 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.997875 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.020529 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.021263 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="openstack-network-exporter" containerID="cri-o://e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373" gracePeriod=300 Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.049035 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.049118 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:49:56.04909626 +0000 UTC m=+1307.241614124 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.061648 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.066854 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.068395 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.069753 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.091506 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ttz5x"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.132222 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.169342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.169525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.184986 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.232683 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ttz5x"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.255994 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 
21:49:55.272191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.272336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.273102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.276565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.297390 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.297644 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7677694455-vk4hv" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" containerID="cri-o://0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678" gracePeriod=10 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.310009 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="ovsdbserver-sb" containerID="cri-o://fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b" gracePeriod=300 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.330141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.353454 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jkspq"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.373990 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jkspq"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.399622 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-zjbsw"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.423348 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-zjbsw"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.454229 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.455382 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.463323 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.464792 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.483675 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.493308 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:55.993274627 +0000 UTC m=+1307.185792491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.496856 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.498152 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.500774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.510545 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.586573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.586623 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9sz\" (UniqueName: \"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.699539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.699670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.699717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9sz\" (UniqueName: \"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.699735 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.701348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.719186 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9sz\" (UniqueName: \"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc 
kubenswrapper[4795]: I0219 21:49:55.796300 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" path="/var/lib/kubelet/pods/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.798197 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="282595b2-0eaa-4deb-9af4-288241817325" path="/var/lib/kubelet/pods/282595b2-0eaa-4deb-9af4-288241817325/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.798767 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="454af6b2-4c9e-4706-a537-b3e3d468353d" path="/var/lib/kubelet/pods/454af6b2-4c9e-4706-a537-b3e3d468353d/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.801647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.801794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.801917 4795 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.801981 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts 
podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:56.301965383 +0000 UTC m=+1307.494483247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : configmap "openstack-cell1-scripts" not found Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.803814 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" path="/var/lib/kubelet/pods/5084e7b9-4923-449e-b0d7-28c602faeff0/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.804842 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541fd524-94f2-4149-b16b-ab11a716ff95" path="/var/lib/kubelet/pods/541fd524-94f2-4149-b16b-ab11a716ff95/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.805505 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57961551-d4f8-4586-b255-8810fbdb499a" path="/var/lib/kubelet/pods/57961551-d4f8-4586-b255-8810fbdb499a/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.805838 4795 projected.go:194] Error preparing data for projected volume kube-api-access-z8sbw for pod openstack/nova-cell1-e48f-account-create-update-n77q8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.805869 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:56.305858143 +0000 UTC m=+1307.498376007 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z8sbw" (UniqueName: "kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.812191 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b15ba11-a170-4fac-bac1-15ecf9de7379" path="/var/lib/kubelet/pods/5b15ba11-a170-4fac-bac1-15ecf9de7379/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.812723 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d152069-2c3d-4cf4-94e8-3068e24def9f" path="/var/lib/kubelet/pods/7d152069-2c3d-4cf4-94e8-3068e24def9f/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.813467 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890a044b-0060-4feb-866b-9a9e80bfa706" path="/var/lib/kubelet/pods/890a044b-0060-4feb-866b-9a9e80bfa706/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.814029 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" path="/var/lib/kubelet/pods/e61f40e0-d6c3-49f7-a93f-d9956f086d4b/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834738 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834772 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834788 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834807 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 
21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834826 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834853 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834870 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834881 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834893 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834912 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834924 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834940 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834961 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834973 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834984 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834995 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"] Feb 
19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.835320 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6df95dfbd4-ftf6x" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-log" containerID="cri-o://16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.835517 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6df95dfbd4-ftf6x" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-api" containerID="cri-o://9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841475 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-server" containerID="cri-o://51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841595 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="swift-recon-cron" containerID="cri-o://955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841668 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="rsync" containerID="cri-o://bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841747 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" 
containerName="object-expirer" containerID="cri-o://8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841804 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-updater" containerID="cri-o://2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841866 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-auditor" containerID="cri-o://cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841917 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-replicator" containerID="cri-o://c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841993 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-server" containerID="cri-o://c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841980 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-p9cs4" podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerName="openstack-network-exporter" containerID="cri-o://3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842052 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-updater" containerID="cri-o://f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842095 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-auditor" containerID="cri-o://5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842188 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-replicator" containerID="cri-o://be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842234 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-576c65f985-r97z7" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-api" containerID="cri-o://b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842244 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-server" containerID="cri-o://a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842293 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-reaper" 
containerID="cri-o://d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842336 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-auditor" containerID="cri-o://22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842386 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-replicator" containerID="cri-o://b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842499 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-576c65f985-r97z7" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd" containerID="cri-o://fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.885537 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.939794 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.957712 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.015464 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"] Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.023091 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.023188 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:57.023142641 +0000 UTC m=+1308.215660505 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.024072 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.029228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ktl2b" event={"ID":"2164f9d1-1d8b-486b-beca-0d3a5172b302","Type":"ContainerStarted","Data":"0bc8d13f4092138cc363d9e77ad1f35f49f21dad6c940b0ffcd7de9f24d779fb"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.052446 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3c2bcb9c-07d3-4d71-924b-aacd537e3430/ovsdbserver-sb/0.log" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.052501 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerID="e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373" exitCode=2 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.052520 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerID="fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b" exitCode=143 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.053360 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.053393 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerDied","Data":"e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.053414 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerDied","Data":"fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.062894 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068583 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3c5a8678-8ce2-4bee-9160-37b1dea9f897/ovsdbserver-nb/0.log" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068658 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerID="9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b" exitCode=2 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068678 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerID="146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033" exitCode=143 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068907 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerDied","Data":"9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerDied","Data":"146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.077364 4795 generic.go:334] "Generic (PLEG): container finished" podID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerID="0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678" exitCode=0 Feb 19 21:49:56 crc 
kubenswrapper[4795]: I0219 21:49:56.077415 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerDied","Data":"0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.079323 4795 generic.go:334] "Generic (PLEG): container finished" podID="336beec4-e534-448f-8367-78645b53650e" containerID="6f5fd7fb0db869022abdd3586a7debaf90502df56c7437b86541c8afbbd3687a" exitCode=137 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.079421 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.128987 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.129071 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.129091 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.138744 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.138791 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.138819 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.139624 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.139675 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:49:58.139659587 +0000 UTC m=+1309.332177451 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.170254 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5" (OuterVolumeSpecName: "kube-api-access-s8xm5") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "kube-api-access-s8xm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.244272 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.246919 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api-log" containerID="cri-o://46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.247115 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.247286 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api" containerID="cri-o://067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.256461 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.265124 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.274951 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.275841 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="cinder-scheduler" containerID="cri-o://5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.276633 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="probe" containerID="cri-o://b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.303638 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.309177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config" (OuterVolumeSpecName: "config") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.326881 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.331845 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.352457 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.354324 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.354416 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.356312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.356473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.348825 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod 
"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.354750 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-log" containerID="cri-o://e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.354767 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-httpd" containerID="cri-o://c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.353891 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-httpd" containerID="cri-o://708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.353518 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-log" containerID="cri-o://46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.357420 4795 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.358511 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:57.358488629 +0000 UTC m=+1308.551006493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : configmap "openstack-cell1-scripts" not found Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.358788 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.358844 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.359996 4795 projected.go:194] Error preparing data for projected volume kube-api-access-z8sbw for pod openstack/nova-cell1-e48f-account-create-update-n77q8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.360106 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:57.360092134 +0000 UTC m=+1308.552609998 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z8sbw" (UniqueName: "kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.362958 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-lqp9l"] Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.391433 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:56 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: if [ -n "barbican" ]; then Feb 19 21:49:56 crc kubenswrapper[4795]: GRANT_DATABASE="barbican" Feb 19 21:49:56 crc kubenswrapper[4795]: else Feb 19 21:49:56 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:56 crc kubenswrapper[4795]: fi Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:56 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:56 crc kubenswrapper[4795]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:56 crc kubenswrapper[4795]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:56 crc kubenswrapper[4795]: # support updates Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.391498 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-lqp9l"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.393014 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.393210 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-d602-account-create-update-lcd8k" podUID="10e13a52-b0f3-447a-b47e-2c4dd50d6400" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.431804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.456667 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.460147 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.460188 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.472725 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.472802 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq" containerID="cri-o://5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e" gracePeriod=604800 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.482515 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.482903 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7cd95cf589-2gw48" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker-log" containerID="cri-o://0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.483842 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-worker-7cd95cf589-2gw48" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker" containerID="cri-o://f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.496425 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.503391 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.510336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.516144 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.516711 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f98bf9994-pr48x" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log" containerID="cri-o://d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.517187 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f98bf9994-pr48x" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api" containerID="cri-o://a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.529385 4795 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-db-create-7ktnd"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.565322 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.570803 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7ktnd"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.576094 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3c5a8678-8ce2-4bee-9160-37b1dea9f897/ovsdbserver-nb/0.log" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.576150 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.600945 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3c2bcb9c-07d3-4d71-924b-aacd537e3430/ovsdbserver-sb/0.log" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.601004 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.606982 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" containerID="cri-o://ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.608398 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:56 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: if [ -n "placement" ]; then Feb 19 21:49:56 crc kubenswrapper[4795]: GRANT_DATABASE="placement" Feb 19 21:49:56 crc kubenswrapper[4795]: else Feb 19 21:49:56 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:56 crc kubenswrapper[4795]: fi Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:56 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:56 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:56 crc kubenswrapper[4795]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:56 crc kubenswrapper[4795]: # support updates Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.612887 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-c741-account-create-update-26ljt" podUID="3e65bdd0-b6ac-406d-bc79-ade76397295e" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.614217 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.614506 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener-log" containerID="cri-o://374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.614633 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener" containerID="cri-o://b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.620771 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.661651 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.669750 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.670074 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log" containerID="cri-o://a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.670247 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" containerID="cri-o://11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.693333 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.696509 4795 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 19 21:49:56 crc kubenswrapper[4795]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 19 21:49:56 crc kubenswrapper[4795]: + source /usr/local/bin/container-scripts/functions Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNBridge=br-int Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNRemote=tcp:localhost:6642 Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNEncapType=geneve Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNAvailabilityZones= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ 
EnableChassisAsGateway=true Feb 19 21:49:56 crc kubenswrapper[4795]: ++ PhysicalNetworks= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNHostName= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 19 21:49:56 crc kubenswrapper[4795]: ++ ovs_dir=/var/lib/openvswitch Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 19 21:49:56 crc kubenswrapper[4795]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + sleep 0.5 Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + cleanup_ovsdb_server_semaphore Feb 19 21:49:56 crc kubenswrapper[4795]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 19 21:49:56 crc kubenswrapper[4795]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-tl5hf" message=< Feb 19 21:49:56 crc kubenswrapper[4795]: Exiting ovsdb-server (5) [ OK ] Feb 19 21:49:56 crc kubenswrapper[4795]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 19 21:49:56 crc kubenswrapper[4795]: + source /usr/local/bin/container-scripts/functions Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNBridge=br-int Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNRemote=tcp:localhost:6642 Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNEncapType=geneve Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNAvailabilityZones= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ EnableChassisAsGateway=true Feb 19 21:49:56 crc 
kubenswrapper[4795]: ++ PhysicalNetworks= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNHostName= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 19 21:49:56 crc kubenswrapper[4795]: ++ ovs_dir=/var/lib/openvswitch Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 19 21:49:56 crc kubenswrapper[4795]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + sleep 0.5 Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + cleanup_ovsdb_server_semaphore Feb 19 21:49:56 crc kubenswrapper[4795]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 19 21:49:56 crc kubenswrapper[4795]: > Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.696543 4795 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 19 21:49:56 crc kubenswrapper[4795]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 19 21:49:56 crc kubenswrapper[4795]: + source /usr/local/bin/container-scripts/functions Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNBridge=br-int Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNRemote=tcp:localhost:6642 Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNEncapType=geneve Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNAvailabilityZones= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ EnableChassisAsGateway=true Feb 19 21:49:56 crc kubenswrapper[4795]: ++ PhysicalNetworks= Feb 19 21:49:56 crc 
kubenswrapper[4795]: ++ OVNHostName= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 19 21:49:56 crc kubenswrapper[4795]: ++ ovs_dir=/var/lib/openvswitch Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 19 21:49:56 crc kubenswrapper[4795]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + sleep 0.5 Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + cleanup_ovsdb_server_semaphore Feb 19 21:49:56 crc kubenswrapper[4795]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 19 21:49:56 crc kubenswrapper[4795]: > pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" containerID="cri-o://e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.696575 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" containerID="cri-o://e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.705883 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.718683 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:56 crc 
kubenswrapper[4795]: I0219 21:49:56.718960 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log" containerID="cri-o://4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.719631 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-api" containerID="cri-o://5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.741108 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-h72xz"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.756781 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.768895 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.769646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.768942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.769928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.769955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") pod \"336beec4-e534-448f-8367-78645b53650e\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770035 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770126 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770280 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770302 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") pod \"336beec4-e534-448f-8367-78645b53650e\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc 
kubenswrapper[4795]: I0219 21:49:56.770388 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770440 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") pod \"336beec4-e534-448f-8367-78645b53650e\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770507 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwcsv\" (UniqueName: 
\"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770573 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") pod \"336beec4-e534-448f-8367-78645b53650e\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770598 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770996 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config" (OuterVolumeSpecName: "config") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.772271 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.772309 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.774443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config" (OuterVolumeSpecName: "config") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.776407 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-h72xz"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.811730 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.820212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv" (OuterVolumeSpecName: "kube-api-access-fwcsv") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "kube-api-access-fwcsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.828951 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts" (OuterVolumeSpecName: "scripts") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.829322 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.828617 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.830805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts" (OuterVolumeSpecName: "scripts") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878288 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878345 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878371 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878401 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwcsv\" (UniqueName: \"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878412 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878421 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.882910 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.883061 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58" (OuterVolumeSpecName: "kube-api-access-czm58") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "kube-api-access-czm58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.883153 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2" (OuterVolumeSpecName: "kube-api-access-bbtv2") pod "336beec4-e534-448f-8367-78645b53650e" (UID: "336beec4-e534-448f-8367-78645b53650e"). InnerVolumeSpecName "kube-api-access-bbtv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.931647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.932538 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.933282 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-z8sbw operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell1-e48f-account-create-update-n77q8" podUID="7400eda6-e731-4942-b002-c81dd9a87e6a" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.939764 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "336beec4-e534-448f-8367-78645b53650e" (UID: "336beec4-e534-448f-8367-78645b53650e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.947091 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-r8v4f"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.951426 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.957381 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.965211 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-r8v4f"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.968338 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.971955 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.972052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "336beec4-e534-448f-8367-78645b53650e" (UID: "336beec4-e534-448f-8367-78645b53650e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.978555 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.978765 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980414 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980433 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980442 4795 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980450 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980458 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980466 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980492 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980503 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.986981 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:56.995892 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.011628 4795 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.023943 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p9cs4_eee0ea5d-4b43-4421-b23e-555c5eac3564/openstack-network-exporter/0.log" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.024007 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.025319 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "336beec4-e534-448f-8367-78645b53650e" (UID: "336beec4-e534-448f-8367-78645b53650e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.027922 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.055344 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq" containerID="cri-o://65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" gracePeriod=604800 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.055897 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.060006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.063639 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:57 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: if [ -n "cinder" ]; then Feb 19 21:49:57 crc 
kubenswrapper[4795]: GRANT_DATABASE="cinder" Feb 19 21:49:57 crc kubenswrapper[4795]: else Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:57 crc kubenswrapper[4795]: fi Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:57 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:57 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:57 crc kubenswrapper[4795]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:57 crc kubenswrapper[4795]: # support updates Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.064853 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-9f51-account-create-update-z87p8" podUID="b0500ca0-0cef-4b76-9c78-cb2189b520ff" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.066861 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.067450 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.074466 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.076925 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.082383 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.082616 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" containerID="cri-o://4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086600 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086659 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc 
kubenswrapper[4795]: I0219 21:49:57.086748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086805 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087368 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087388 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087399 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087407 4795 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087416 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.087471 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.087506 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.087492859 +0000 UTC m=+1310.280010723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087702 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087847 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.092418 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config" (OuterVolumeSpecName: "config") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.092851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.097000 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.103871 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.107444 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj" (OuterVolumeSpecName: "kube-api-access-6wdlj") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "kube-api-access-6wdlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.110636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-z87p8" event={"ID":"b0500ca0-0cef-4b76-9c78-cb2189b520ff","Type":"ContainerStarted","Data":"72ececc667f06320402ffe76c00f9ed550ee45b4a8e93c91bfe4b2261f921f68"} Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.139736 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:57 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: if [ -n "glance" ]; then Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="glance" Feb 19 21:49:57 crc kubenswrapper[4795]: else Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:57 crc kubenswrapper[4795]: fi Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:57 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:57 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:57 crc kubenswrapper[4795]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:57 crc kubenswrapper[4795]: # support updates Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.140885 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-f769-account-create-update-8k7r2" podUID="8eaa69df-d563-4dc0-8a78-40413946cbca" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.144703 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.144942 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler" containerID="cri-o://f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.145882 4795 generic.go:334] "Generic (PLEG): container finished" podID="e4a6a069-904a-4072-b98c-346f67f22def" containerID="d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.145982 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerDied","Data":"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.149382 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod 
"3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.149486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.162957 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.167272 4795 generic.go:334] "Generic (PLEG): container finished" podID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerID="fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.167339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerDied","Data":"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.181889 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="galera" containerID="cri-o://ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192622 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") on node \"crc\" DevicePath 
\"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192649 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192658 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192667 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192675 4795 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192683 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.193474 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3c5a8678-8ce2-4bee-9160-37b1dea9f897/ovsdbserver-nb/0.log" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.193555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerDied","Data":"f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.193592 4795 scope.go:117] "RemoveContainer" 
containerID="9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.193709 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.222634 4795 generic.go:334] "Generic (PLEG): container finished" podID="250e9cae-06d9-44da-88af-239d15356a3c" containerID="f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.222665 4795 generic.go:334] "Generic (PLEG): container finished" podID="250e9cae-06d9-44da-88af-239d15356a3c" containerID="0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.222745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerDied","Data":"f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.222773 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerDied","Data":"0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.225526 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p9cs4_eee0ea5d-4b43-4421-b23e-555c5eac3564/openstack-network-exporter/0.log" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.225664 4795 generic.go:334] "Generic (PLEG): container finished" podID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerID="3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4" exitCode=2 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.225815 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-metrics-p9cs4" event={"ID":"eee0ea5d-4b43-4421-b23e-555c5eac3564","Type":"ContainerDied","Data":"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.225932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9cs4" event={"ID":"eee0ea5d-4b43-4421-b23e-555c5eac3564","Type":"ContainerDied","Data":"eaba90113d6ff0b858d733af82b8a4a862659df0d41e63fdc645db66d9298341"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.226053 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.227639 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.230969 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerID="46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.231294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerDied","Data":"46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.233642 4795 generic.go:334] "Generic (PLEG): container finished" podID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerID="374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.233681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerDied","Data":"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.234802 4795 generic.go:334] "Generic (PLEG): container finished" podID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerID="a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.234871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerDied","Data":"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238691 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e" 
exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238714 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238779 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238789 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238796 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238802 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238808 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238815 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238821 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" 
containerID="be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238826 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238833 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238839 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238844 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238850 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e"} Feb 19 21:49:57 crc 
kubenswrapper[4795]: I0219 21:49:57.238905 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238938 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238946 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238969 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238992 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.243143 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3c2bcb9c-07d3-4d71-924b-aacd537e3430/ovsdbserver-sb/0.log" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.243391 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.243526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerDied","Data":"0753bcc18c087ec61d4625b239ed921fd6b476f148310ba726f57a4cfa8d345c"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.251300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-26ljt" event={"ID":"3e65bdd0-b6ac-406d-bc79-ade76397295e","Type":"ContainerStarted","Data":"75197e9e2a0c3d83dd9ff3b61a9f77e4a300ce7b88d838cd67a1b818ba9ef069"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.270802 4795 generic.go:334] "Generic (PLEG): container finished" podID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.271059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerDied","Data":"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.281376 4795 generic.go:334] "Generic (PLEG): container finished" podID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerID="16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.281473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerDied","Data":"16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.294964 4795 generic.go:334] "Generic (PLEG): container finished" podID="793bbadc-8b53-4084-a63a-0b76b37284df" 
containerID="4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.295209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerDied","Data":"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.295350 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.296500 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.298867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerDied","Data":"cdc18778f810815fa368ef0cc45dbb0e103ffbfcfa9231a83aa5be2dd6cfe1c2"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.298966 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.323551 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.343006 4795 generic.go:334] "Generic (PLEG): container finished" podID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerID="46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.343076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerDied","Data":"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f"} Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.343307 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:57 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: if [ -n "nova_cell0" ]; then Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="nova_cell0" Feb 19 21:49:57 crc kubenswrapper[4795]: else Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:57 crc kubenswrapper[4795]: fi Feb 19 21:49:57 crc 
kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:57 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:57 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:57 crc kubenswrapper[4795]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:57 crc kubenswrapper[4795]: # support updates Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.344890 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" podUID="ee3fde95-91bf-4f6a-9753-f879d56fedbb" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.347657 4795 generic.go:334] "Generic (PLEG): container finished" podID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerID="e72a80485fc6a5dd8d829ffc139130a167468a52119185e5d82a547efb9020bb" exitCode=1 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.347846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ktl2b" event={"ID":"2164f9d1-1d8b-486b-beca-0d3a5172b302","Type":"ContainerDied","Data":"e72a80485fc6a5dd8d829ffc139130a167468a52119185e5d82a547efb9020bb"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.348122 4795 scope.go:117] "RemoveContainer" containerID="e72a80485fc6a5dd8d829ffc139130a167468a52119185e5d82a547efb9020bb" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.358449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-lcd8k" 
event={"ID":"10e13a52-b0f3-447a-b47e-2c4dd50d6400","Type":"ContainerStarted","Data":"6e320135d03de930dfd2e7664584c07e81ae7f341dfa862b733a714182521b02"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.367759 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.389318 4795 generic.go:334] "Generic (PLEG): container finished" podID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerID="e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.389395 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.389392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerDied","Data":"e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.400492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.400606 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:57 crc kubenswrapper[4795]: 
E0219 21:49:57.400670 4795 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.400744 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.400724863 +0000 UTC m=+1310.593242727 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : configmap "openstack-cell1-scripts" not found Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.401477 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:57 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: if [ -n "nova_api" ]; then Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="nova_api" Feb 19 21:49:57 crc kubenswrapper[4795]: else Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:57 crc kubenswrapper[4795]: fi Feb 19 21:49:57 
crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:57 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:57 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:57 crc kubenswrapper[4795]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:57 crc kubenswrapper[4795]: # support updates Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.402618 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-2d62-account-create-update-4vs7v" podUID="299d8d1c-c181-4c7b-b95f-9f3c62ddb102" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.408031 4795 projected.go:194] Error preparing data for projected volume kube-api-access-z8sbw for pod openstack/nova-cell1-e48f-account-create-update-n77q8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.408098 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.4080801 +0000 UTC m=+1310.600597964 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z8sbw" (UniqueName: "kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.441536 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.527763 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.553572 4795 scope.go:117] "RemoveContainer" containerID="146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.553784 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.557879 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.587241 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.616258 4795 scope.go:117] "RemoveContainer" containerID="3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.633687 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18561896-d336-4962-8e9e-4ccf748f8605" path="/var/lib/kubelet/pods/18561896-d336-4962-8e9e-4ccf748f8605/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.634541 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1946f4fd-5254-4e66-8739-5a51af23e963" path="/var/lib/kubelet/pods/1946f4fd-5254-4e66-8739-5a51af23e963/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.636728 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d85454-a8db-47bc-b616-bbdb4f6d8920" path="/var/lib/kubelet/pods/29d85454-a8db-47bc-b616-bbdb4f6d8920/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.637517 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="336beec4-e534-448f-8367-78645b53650e" path="/var/lib/kubelet/pods/336beec4-e534-448f-8367-78645b53650e/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.638808 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" path="/var/lib/kubelet/pods/3778f66e-fd7f-4af5-ae3e-2a7c272785a0/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.639299 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 
21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.640051 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" path="/var/lib/kubelet/pods/4cb700b2-4c29-4deb-a379-d18f2695dcaf/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.640327 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.640436 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.640718 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" path="/var/lib/kubelet/pods/573a7aa5-43d9-4523-8eea-4c1a36da49fb/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.641316 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6952c796-d85e-49b3-b931-60966311a0c0" path="/var/lib/kubelet/pods/6952c796-d85e-49b3-b931-60966311a0c0/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.643226 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" path="/var/lib/kubelet/pods/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.643703 4795 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.643851 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.643880 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.647648 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.647713 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:49:57 
crc kubenswrapper[4795]: I0219 21:49:57.648315 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f01f44-1467-442f-b91f-ac1765626a3d" path="/var/lib/kubelet/pods/73f01f44-1467-442f-b91f-ac1765626a3d/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.649149 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" path="/var/lib/kubelet/pods/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.649866 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" path="/var/lib/kubelet/pods/a2c0e289-4e3b-4b5a-93db-d38621a870ec/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.653409 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" path="/var/lib/kubelet/pods/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.658259 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1d3a710-addc-4f86-b77c-0d05dc98695f" path="/var/lib/kubelet/pods/d1d3a710-addc-4f86-b77c-0d05dc98695f/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.664400 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" path="/var/lib/kubelet/pods/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.665288 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" path="/var/lib/kubelet/pods/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.666020 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" path="/var/lib/kubelet/pods/f8945f31-b1d9-4c65-9f8c-2619f87d4237/volumes" Feb 19 21:49:57 
crc kubenswrapper[4795]: I0219 21:49:57.666782 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" path="/var/lib/kubelet/pods/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.667940 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd8e89cd-b890-4f36-9008-59767ccbad91" path="/var/lib/kubelet/pods/fd8e89cd-b890-4f36-9008-59767ccbad91/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.671745 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.671780 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.707399 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") pod \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.707536 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") pod \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.710687 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0500ca0-0cef-4b76-9c78-cb2189b520ff" (UID: "b0500ca0-0cef-4b76-9c78-cb2189b520ff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.718860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc" (OuterVolumeSpecName: "kube-api-access-pc6xc") pod "b0500ca0-0cef-4b76-9c78-cb2189b520ff" (UID: "b0500ca0-0cef-4b76-9c78-cb2189b520ff"). InnerVolumeSpecName "kube-api-access-pc6xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.752104 4795 scope.go:117] "RemoveContainer" containerID="3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.765495 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4\": container with ID starting with 3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4 not found: ID does not exist" containerID="3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.765541 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4"} err="failed to get container status \"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4\": rpc error: code = NotFound desc = could not find container \"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4\": container with ID starting with 3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4 not found: ID does not exist" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.765576 4795 scope.go:117] "RemoveContainer" containerID="e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.805918 
4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.816007 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.816045 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.897535 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.898717 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.903836 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.935561 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.947000 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.948232 4795 scope.go:117] "RemoveContainer" containerID="fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.963232 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.970879 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.973407 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.973651 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-858c4dcd57-whkj2" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-httpd" containerID="cri-o://38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.973781 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-858c4dcd57-whkj2" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-server" containerID="cri-o://812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.990006 4795 scope.go:117] "RemoveContainer" containerID="0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678" Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.014629 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.016762 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.018391 4795 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.018462 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") pod \"3e65bdd0-b6ac-406d-bc79-ade76397295e\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022709 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w65w5\" (UniqueName: 
\"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") pod \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022786 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") pod \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022880 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.023272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") pod \"3e65bdd0-b6ac-406d-bc79-ade76397295e\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.023459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.023533 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " Feb 19 21:49:58 crc 
kubenswrapper[4795]: I0219 21:49:58.024543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs" (OuterVolumeSpecName: "logs") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.029343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e65bdd0-b6ac-406d-bc79-ade76397295e" (UID: "3e65bdd0-b6ac-406d-bc79-ade76397295e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.029640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10e13a52-b0f3-447a-b47e-2c4dd50d6400" (UID: "10e13a52-b0f3-447a-b47e-2c4dd50d6400"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.029690 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5" (OuterVolumeSpecName: "kube-api-access-w65w5") pod "10e13a52-b0f3-447a-b47e-2c4dd50d6400" (UID: "10e13a52-b0f3-447a-b47e-2c4dd50d6400"). InnerVolumeSpecName "kube-api-access-w65w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.029750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.031655 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4" (OuterVolumeSpecName: "kube-api-access-mzgh4") pod "3e65bdd0-b6ac-406d-bc79-ade76397295e" (UID: "3e65bdd0-b6ac-406d-bc79-ade76397295e"). InnerVolumeSpecName "kube-api-access-mzgh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.032132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5" (OuterVolumeSpecName: "kube-api-access-6npn5") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). InnerVolumeSpecName "kube-api-access-6npn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.032616 4795 scope.go:117] "RemoveContainer" containerID="445b39298e77e683eee2d951a286aa4f9de79beb92f370dca4d81f0dfe56d255" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.057921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.091373 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data" (OuterVolumeSpecName: "config-data") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.115665 4795 scope.go:117] "RemoveContainer" containerID="6f5fd7fb0db869022abdd3586a7debaf90502df56c7437b86541c8afbbd3687a" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126269 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126300 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126309 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126318 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w65w5\" (UniqueName: \"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126328 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126335 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126343 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126352 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126406 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.228057 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.228503 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:50:02.228238412 +0000 UTC m=+1313.420756276 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.234091 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3 is running failed: container process not found" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.234580 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3 is running failed: container process not found" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.235762 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3 is running failed: container process not found" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.235791 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3 is running failed: container process not found" probeType="Readiness" 
pod="openstack/nova-cell1-conductor-0" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.345382 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.408529 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-z87p8" event={"ID":"b0500ca0-0cef-4b76-9c78-cb2189b520ff","Type":"ContainerDied","Data":"72ececc667f06320402ffe76c00f9ed550ee45b4a8e93c91bfe4b2261f921f68"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.408616 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.415580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-26ljt" event={"ID":"3e65bdd0-b6ac-406d-bc79-ade76397295e","Type":"ContainerDied","Data":"75197e9e2a0c3d83dd9ff3b61a9f77e4a300ce7b88d838cd67a1b818ba9ef069"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.415675 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.423015 4795 generic.go:334] "Generic (PLEG): container finished" podID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerID="b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1" exitCode=0 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.423088 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerDied","Data":"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.426448 4795 generic.go:334] "Generic (PLEG): container finished" podID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerID="37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a" exitCode=1 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.427106 4795 scope.go:117] "RemoveContainer" containerID="37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a" Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.427552 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-ktl2b_openstack(2164f9d1-1d8b-486b-beca-0d3a5172b302)\"" pod="openstack/root-account-create-update-ktl2b" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.427778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ktl2b" event={"ID":"2164f9d1-1d8b-486b-beca-0d3a5172b302","Type":"ContainerDied","Data":"37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.427804 4795 scope.go:117] "RemoveContainer" 
containerID="e72a80485fc6a5dd8d829ffc139130a167468a52119185e5d82a547efb9020bb" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.429995 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430039 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430516 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430610 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430665 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.443321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj" (OuterVolumeSpecName: "kube-api-access-cz5kj") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "kube-api-access-cz5kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.462373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-lcd8k" event={"ID":"10e13a52-b0f3-447a-b47e-2c4dd50d6400","Type":"ContainerDied","Data":"6e320135d03de930dfd2e7664584c07e81ae7f341dfa862b733a714182521b02"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.462449 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.469218 4795 generic.go:334] "Generic (PLEG): container finished" podID="0adadcd9-8949-443b-8042-d0d11191eae9" containerID="0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f" exitCode=0 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.469805 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0adadcd9-8949-443b-8042-d0d11191eae9","Type":"ContainerDied","Data":"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.469910 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0adadcd9-8949-443b-8042-d0d11191eae9","Type":"ContainerDied","Data":"6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.470319 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.491704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.510899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" event={"ID":"ee3fde95-91bf-4f6a-9753-f879d56fedbb","Type":"ContainerStarted","Data":"8f247d4d74ea4937a4f6282c8b5e4ccafc361e919f81dbaf677caedc499822b7"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.537045 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.537071 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.549043 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.556208 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerID="ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196" exitCode=0 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.556288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerDied","Data":"ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.567712 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerDied","Data":"26ff50c7b1851e9704bfa4221d66176820b3417a16cff032c2c82bc2945df7a8"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.567828 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.580986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-4vs7v" event={"ID":"299d8d1c-c181-4c7b-b95f-9f3c62ddb102","Type":"ContainerStarted","Data":"a6e6e439c0ca9f9cb3832a5b2b0c274f01e0f16a2f8f82d2544c41499347ea51"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.592367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data" (OuterVolumeSpecName: "config-data") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.599724 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.601204 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" exitCode=0 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.601514 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6fd7841-2a08-4786-8e96-b2ab0f477eff","Type":"ContainerDied","Data":"4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.617729 4795 generic.go:334] "Generic (PLEG): container finished" podID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerID="38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66" exitCode=0 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.617794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerDied","Data":"38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.627823 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.628936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f769-account-create-update-8k7r2" event={"ID":"8eaa69df-d563-4dc0-8a78-40413946cbca","Type":"ContainerStarted","Data":"7d6e2a5c198e5bc708e1811f1860c10bf53d61ca572544f39bca6ba9c9264ae5"} Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.638565 4795 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.638592 4795 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.638603 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.661128 4795 scope.go:117] "RemoveContainer" containerID="0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.662257 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.695916 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.700852 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.718005 4795 scope.go:117] "RemoveContainer" containerID="0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.720275 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.723785 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f\": container with ID starting with 0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f not found: ID does not exist" containerID="0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.723955 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f"} err="failed to get container status \"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f\": rpc error: code = NotFound desc = could not find container \"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f\": container with ID starting with 0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f not found: ID does not exist" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.724037 4795 scope.go:117] "RemoveContainer" 
containerID="f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.732331 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.739344 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") pod \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.739433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") pod \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.739453 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") pod \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.743370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87" (OuterVolumeSpecName: "kube-api-access-qtr87") pod "f6fd7841-2a08-4786-8e96-b2ab0f477eff" (UID: "f6fd7841-2a08-4786-8e96-b2ab0f477eff"). InnerVolumeSpecName "kube-api-access-qtr87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.745738 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.754282 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.796070 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.798742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6fd7841-2a08-4786-8e96-b2ab0f477eff" (UID: "f6fd7841-2a08-4786-8e96-b2ab0f477eff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.801421 4795 scope.go:117] "RemoveContainer" containerID="0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.802469 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data" (OuterVolumeSpecName: "config-data") pod "f6fd7841-2a08-4786-8e96-b2ab0f477eff" (UID: "f6fd7841-2a08-4786-8e96-b2ab0f477eff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.825819 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.836498 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841758 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841791 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841801 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841812 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841822 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.849381 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.859248 
4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.863249 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.881789 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.953941 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.953993 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.954017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.954129 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.954178 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.955005 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs" (OuterVolumeSpecName: "logs") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.960596 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.960891 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-central-agent" containerID="cri-o://a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" gracePeriod=30 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.961281 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd" containerID="cri-o://c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" gracePeriod=30 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.961323 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="sg-core" containerID="cri-o://21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" gracePeriod=30 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.961354 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-notification-agent" containerID="cri-o://98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" gracePeriod=30 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.969528 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh" (OuterVolumeSpecName: "kube-api-access-v4hjh") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "kube-api-access-v4hjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.986794 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.000664 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.000868 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="296f6b57-de45-495d-abe9-8c779c157057" containerName="kube-state-metrics" containerID="cri-o://5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.002334 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.070339 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.097132 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.126428 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"] Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.141464 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.141539 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:03.141516561 +0000 UTC m=+1314.334034425 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.146159 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.146280 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.146358 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.146477 4795 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.190213 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.203136 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.209159 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.209440 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerName="memcached" containerID="cri-o://db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.248644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") pod \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.248691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") pod \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.248876 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9sz\" (UniqueName: 
\"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") pod \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.248900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") pod \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.257920 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "299d8d1c-c181-4c7b-b95f-9f3c62ddb102" (UID: "299d8d1c-c181-4c7b-b95f-9f3c62ddb102"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.259241 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee3fde95-91bf-4f6a-9753-f879d56fedbb" (UID: "ee3fde95-91bf-4f6a-9753-f879d56fedbb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.270234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz" (OuterVolumeSpecName: "kube-api-access-vq9sz") pod "ee3fde95-91bf-4f6a-9753-f879d56fedbb" (UID: "ee3fde95-91bf-4f6a-9753-f879d56fedbb"). InnerVolumeSpecName "kube-api-access-vq9sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.283300 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68" (OuterVolumeSpecName: "kube-api-access-4xc68") pod "299d8d1c-c181-4c7b-b95f-9f3c62ddb102" (UID: "299d8d1c-c181-4c7b-b95f-9f3c62ddb102"). InnerVolumeSpecName "kube-api-access-4xc68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.299678 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300030 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="ovsdbserver-nb" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300047 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="ovsdbserver-nb" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300057 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300063 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300074 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="ovsdbserver-sb" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300080 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="ovsdbserver-sb" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300088 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="mysql-bootstrap" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300094 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="mysql-bootstrap" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300106 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300112 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300127 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300135 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker-log" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300141 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker-log" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300152 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300158 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="openstack-network-exporter" 
Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300180 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300186 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300198 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300203 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300231 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="init" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300236 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="init" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300248 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener-log" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300254 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" 
containerName="barbican-keystone-listener-log" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300263 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="galera" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300269 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="galera" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300277 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300282 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300424 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker-log" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300437 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300447 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300456 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="ovsdbserver-nb" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300464 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300475 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="galera" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300485 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300495 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300506 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300513 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="ovsdbserver-sb" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300521 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300530 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300536 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener-log" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.301211 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.304504 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.306345 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data" (OuterVolumeSpecName: "config-data") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.328273 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.347219 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.347263 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356685 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356729 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356781 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356959 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.357012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.357041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: 
\"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.357668 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.359817 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.360329 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.366817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.366951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367141 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367157 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367179 4795 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367188 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") 
on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367196 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367205 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367214 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367222 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq9sz\" (UniqueName: \"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367253 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.372783 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.374502 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.378511 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm" (OuterVolumeSpecName: "kube-api-access-kwqnm") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "kube-api-access-kwqnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.384230 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.384474 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6945f64f65-rnq2b" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" containerName="keystone-api" containerID="cri-o://77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.401778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.414468 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.422593 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.433464 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-q8t82"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.470830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.470902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.470982 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.471001 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.471012 
4795 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.471961 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.472017 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.972002501 +0000 UTC m=+1311.164520355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.474964 4795 projected.go:194] Error preparing data for projected volume kube-api-access-qcb7p for pod openstack/keystone-41fb-account-create-update-h4ql2: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.475033 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.975012996 +0000 UTC m=+1311.167530860 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qcb7p" (UniqueName: "kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.482867 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-q8t82"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.496830 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.504409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.508296 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.547697 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.558082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" path="/var/lib/kubelet/pods/0adadcd9-8949-443b-8042-d0d11191eae9/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.560848 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e13a52-b0f3-447a-b47e-2c4dd50d6400" path="/var/lib/kubelet/pods/10e13a52-b0f3-447a-b47e-2c4dd50d6400/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.562413 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250e9cae-06d9-44da-88af-239d15356a3c" path="/var/lib/kubelet/pods/250e9cae-06d9-44da-88af-239d15356a3c/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.563187 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" path="/var/lib/kubelet/pods/3c2bcb9c-07d3-4d71-924b-aacd537e3430/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.566582 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" path="/var/lib/kubelet/pods/3c5a8678-8ce2-4bee-9160-37b1dea9f897/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.567748 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e65bdd0-b6ac-406d-bc79-ade76397295e" path="/var/lib/kubelet/pods/3e65bdd0-b6ac-406d-bc79-ade76397295e/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.568217 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65449c45-b8f9-445e-80e7-6e3c8541c62c" 
path="/var/lib/kubelet/pods/65449c45-b8f9-445e-80e7-6e3c8541c62c/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.568705 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7400eda6-e731-4942-b002-c81dd9a87e6a" path="/var/lib/kubelet/pods/7400eda6-e731-4942-b002-c81dd9a87e6a/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.576257 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.576279 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.576289 4795 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.576503 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" path="/var/lib/kubelet/pods/a2a7b298-40b6-43b3-9099-ec74f2f0bfad/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.577004 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0500ca0-0cef-4b76-9c78-cb2189b520ff" path="/var/lib/kubelet/pods/b0500ca0-0cef-4b76-9c78-cb2189b520ff/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.577522 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" path="/var/lib/kubelet/pods/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.578249 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" path="/var/lib/kubelet/pods/eee0ea5d-4b43-4421-b23e-555c5eac3564/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.579429 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" path="/var/lib/kubelet/pods/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.613219 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb01bcd5b_435a_4702_b0a4_8dfe8f553c23.slice/crio-conmon-9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb01bcd5b_435a_4702_b0a4_8dfe8f553c23.slice/crio-9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2.scope\": RecentStats: unable to find data in memory cache]" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.640833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" event={"ID":"ee3fde95-91bf-4f6a-9753-f879d56fedbb","Type":"ContainerDied","Data":"8f247d4d74ea4937a4f6282c8b5e4ccafc361e919f81dbaf677caedc499822b7"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.640933 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.651072 4795 generic.go:334] "Generic (PLEG): container finished" podID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerID="812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.651447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerDied","Data":"812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.652922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-4vs7v" event={"ID":"299d8d1c-c181-4c7b-b95f-9f3c62ddb102","Type":"ContainerDied","Data":"a6e6e439c0ca9f9cb3832a5b2b0c274f01e0f16a2f8f82d2544c41499347ea51"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.653109 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.667613 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="galera" containerID="cri-o://0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.670005 4795 generic.go:334] "Generic (PLEG): container finished" podID="296f6b57-de45-495d-abe9-8c779c157057" containerID="5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7" exitCode=2 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.670066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"296f6b57-de45-495d-abe9-8c779c157057","Type":"ContainerDied","Data":"5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.676138 4795 generic.go:334] "Generic (PLEG): container finished" podID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerID="9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.676215 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerDied","Data":"9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681585 4795 generic.go:334] "Generic (PLEG): container finished" podID="e956453d-551f-44b4-8125-8656b3155402" containerID="c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681622 4795 generic.go:334] "Generic (PLEG): container finished" podID="e956453d-551f-44b4-8125-8656b3155402" 
containerID="21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" exitCode=2 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681629 4795 generic.go:334] "Generic (PLEG): container finished" podID="e956453d-551f-44b4-8125-8656b3155402" containerID="a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681701 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.684703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerDied","Data":"fa781215a57c5a384dc9196151cb9d88b19a59e6ec4219a4b6443b0c5d96ab8f"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.684740 4795 scope.go:117] "RemoveContainer" containerID="ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.684858 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.698008 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerID="067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.698066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerDied","Data":"067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.700387 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f98bf9994-pr48x" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:49366->10.217.0.164:9311: read: connection reset by peer" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.700393 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f98bf9994-pr48x" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:49372->10.217.0.164:9311: read: connection reset by peer" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.704150 4795 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-ktl2b" secret="" err="secret \"galera-openstack-dockercfg-snswc\" not found" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.704222 4795 scope.go:117] "RemoveContainer" containerID="37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.704490 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-ktl2b_openstack(2164f9d1-1d8b-486b-beca-0d3a5172b302)\"" pod="openstack/root-account-create-update-ktl2b" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.707945 4795 generic.go:334] "Generic (PLEG): container finished" podID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerID="b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.707999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerDied","Data":"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.708024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerDied","Data":"c0b18fd78cd092c133f6dd779fd8c2b41870a6c99e45b8bcd625ff594cb4d9de"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.708099 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.712476 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.712525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6fd7841-2a08-4786-8e96-b2ab0f477eff","Type":"ContainerDied","Data":"d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.734634 4795 scope.go:117] "RemoveContainer" containerID="026030d64ab176967289d43803e975c62df885e093dcaebf727c13a2d67da5ff" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.735122 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerDied","Data":"c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.735361 4795 generic.go:334] "Generic (PLEG): container finished" podID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerID="c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.741208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f769-account-create-update-8k7r2" event={"ID":"8eaa69df-d563-4dc0-8a78-40413946cbca","Type":"ContainerDied","Data":"7d6e2a5c198e5bc708e1811f1860c10bf53d61ca572544f39bca6ba9c9264ae5"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.741244 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6e2a5c198e5bc708e1811f1860c10bf53d61ca572544f39bca6ba9c9264ae5" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.778085 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.779823 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.779871 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts podName:2164f9d1-1d8b-486b-beca-0d3a5172b302 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:00.279857894 +0000 UTC m=+1311.472375758 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts") pod "root-account-create-update-ktl2b" (UID: "2164f9d1-1d8b-486b-beca-0d3a5172b302") : configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.782061 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.784813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qcb7p operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-41fb-account-create-update-h4ql2" podUID="38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.793143 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.795144 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.808896 4795 scope.go:117] "RemoveContainer" containerID="b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.831320 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.842496 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.857107 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.866569 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.867017 4795 scope.go:117] "RemoveContainer" containerID="374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.893785 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.909960 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.915182 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.941672 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.946877 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990456 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990496 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wfxq\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990529 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990555 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") pod \"8eaa69df-d563-4dc0-8a78-40413946cbca\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990830 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") pod \"8eaa69df-d563-4dc0-8a78-40413946cbca\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 
21:49:59.990851 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.991047 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.991116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.995290 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8eaa69df-d563-4dc0-8a78-40413946cbca" (UID: "8eaa69df-d563-4dc0-8a78-40413946cbca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.995485 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.995533 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:00.995518176 +0000 UTC m=+1312.188036040 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.997544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.998484 4795 projected.go:194] Error preparing data for projected volume kube-api-access-qcb7p for pod openstack/keystone-41fb-account-create-update-h4ql2: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.998578 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:00.998567222 +0000 UTC m=+1312.191085086 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qcb7p" (UniqueName: "kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.998766 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.005634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps" (OuterVolumeSpecName: "kube-api-access-5hkps") pod "8eaa69df-d563-4dc0-8a78-40413946cbca" (UID: "8eaa69df-d563-4dc0-8a78-40413946cbca"). InnerVolumeSpecName "kube-api-access-5hkps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.012637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.014323 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq" (OuterVolumeSpecName: "kube-api-access-2wfxq") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "kube-api-access-2wfxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.025372 4795 scope.go:117] "RemoveContainer" containerID="b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.033263 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075\": container with ID starting with b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075 not found: ID does not exist" containerID="b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.033330 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075"} err="failed to get container status \"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075\": rpc error: code = NotFound desc = could not find container \"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075\": container with ID starting with b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075 not found: ID does not exist" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.033359 4795 scope.go:117] "RemoveContainer" containerID="374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.034287 
4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef\": container with ID starting with 374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef not found: ID does not exist" containerID="374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.034431 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef"} err="failed to get container status \"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef\": rpc error: code = NotFound desc = could not find container \"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef\": container with ID starting with 374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef not found: ID does not exist" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.034452 4795 scope.go:117] "RemoveContainer" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.079113 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data" (OuterVolumeSpecName: "config-data") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.080967 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.089185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095314 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095487 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84vz2\" (UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095849 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095862 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095871 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wfxq\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095879 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095887 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095895 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095902 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095910 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095918 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.098752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs" (OuterVolumeSpecName: "logs") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.108640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2" (OuterVolumeSpecName: "kube-api-access-84vz2") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "kube-api-access-84vz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.118020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts" (OuterVolumeSpecName: "scripts") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.130255 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.197006 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.197033 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84vz2\" (UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.197043 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.197051 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.202192 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.291291 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.297629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data" (OuterVolumeSpecName: "config-data") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.297771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.297908 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298138 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298608 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.300510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.302532 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.303615 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts podName:2164f9d1-1d8b-486b-beca-0d3a5172b302 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:01.303447441 +0000 UTC m=+1312.495965305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts") pod "root-account-create-update-ktl2b" (UID: "2164f9d1-1d8b-486b-beca-0d3a5172b302") : configmap "openstack-scripts" not found
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.303305 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.303891 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.304014 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.304886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb" (OuterVolumeSpecName: "kube-api-access-bg5wb") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "kube-api-access-bg5wb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.306419 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs" (OuterVolumeSpecName: "logs") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.320302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.325732 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts" (OuterVolumeSpecName: "scripts") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.342363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.369250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.369351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.370628 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data" (OuterVolumeSpecName: "config-data") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.386921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405524 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405559 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405572 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405583 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405595 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405620 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405631 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405641 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405650 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.423955 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.507471 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.563871 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.571946 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.618307 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f98bf9994-pr48x"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.693394 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.695329 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.700034 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.710689 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.710772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711131 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") pod \"296f6b57-de45-495d-abe9-8c779c157057\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711355 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") pod \"296f6b57-de45-495d-abe9-8c779c157057\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") pod \"296f6b57-de45-495d-abe9-8c779c157057\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711421 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") pod \"296f6b57-de45-495d-abe9-8c779c157057\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711493 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711512 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.718182 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs" (OuterVolumeSpecName: "logs") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.718520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.719130 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.722106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts" (OuterVolumeSpecName: "scripts") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.728214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq" (OuterVolumeSpecName: "kube-api-access-gj2cq") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "kube-api-access-gj2cq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.734260 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7677694455-vk4hv" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.201:5353: i/o timeout"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.754456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerDied","Data":"4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3"}
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.754510 4795 scope.go:117] "RemoveContainer" containerID="c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.754607 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.766805 4795 generic.go:334] "Generic (PLEG): container finished" podID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerID="db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe" exitCode=0
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.766898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3","Type":"ContainerDied","Data":"db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe"}
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.768785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerDied","Data":"47caa9a7519cc7b778b03d7e938c02973816e703da61178a9af0e7d1bdc77812"}
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.768867 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6df95dfbd4-ftf6x"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.779266 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9" (OuterVolumeSpecName: "kube-api-access-zlmk9") pod "296f6b57-de45-495d-abe9-8c779c157057" (UID: "296f6b57-de45-495d-abe9-8c779c157057"). InnerVolumeSpecName "kube-api-access-zlmk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.784045 4795 generic.go:334] "Generic (PLEG): container finished" podID="793bbadc-8b53-4084-a63a-0b76b37284df" containerID="5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" exitCode=0
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.784108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerDied","Data":"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70"}
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.784129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerDied","Data":"14cc3bd11f9f9a1b0b976a11e87f576616488b4c4d4dfa8a49e1d97fcc43ddfd"}
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.784192 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.794128 4795 scope.go:117] "RemoveContainer" containerID="e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.795459 4795 generic.go:334] "Generic (PLEG): container finished" podID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerID="708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" exitCode=0
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.795521 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.795567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerDied","Data":"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217"}
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.795590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerDied","Data":"6d8d28f68ae7a05b3b24448d485df065e39bc0509b04817346db5c0af58598b8"}
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.809909 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.810000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"296f6b57-de45-495d-abe9-8c779c157057","Type":"ContainerDied","Data":"b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3"}
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.811999 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.812885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.812948 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.812986 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813057 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813122 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813185 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8fsc\" (UniqueName: \"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813252 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813282 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813334 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813354 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813369 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813402 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813427 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") "
Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813463 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName:
\"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxzt2\" (UniqueName: \"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813535 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813556 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813608 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813629 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813944 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813957 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813969 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813978 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.814092 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: 
I0219 21:50:00.814106 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.814620 4795 generic.go:334] "Generic (PLEG): container finished" podID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.814728 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs" (OuterVolumeSpecName: "logs") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.814728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"57b83043-2f7c-4b55-a2b9-66eef96f0008","Type":"ContainerDied","Data":"4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.815250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs" (OuterVolumeSpecName: "logs") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.816231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs" (OuterVolumeSpecName: "logs") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.816449 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.817392 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs" (OuterVolumeSpecName: "logs") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.818117 4795 generic.go:334] "Generic (PLEG): container finished" podID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerID="11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.818191 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerDied","Data":"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.818213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerDied","Data":"d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.818276 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.821404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerDied","Data":"72a722e4ebd20de4e2ab880d4812af758e115c7dc2dbe4b6fadf7ad0adda880d"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.821441 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.823299 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.834864 4795 generic.go:334] "Generic (PLEG): container finished" podID="e4a6a069-904a-4072-b98c-346f67f22def" containerID="a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.834912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerDied","Data":"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.834937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerDied","Data":"95eca73f9943de18ca7dd19f1ef5d95e39ab42d81563dce332afbfa7377d20f4"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.834987 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.837614 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg" (OuterVolumeSpecName: "kube-api-access-5sbjg") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "kube-api-access-5sbjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.838257 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.840577 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.841019 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.841327 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.842310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc" (OuterVolumeSpecName: "kube-api-access-x8fsc") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "kube-api-access-x8fsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.842409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerDied","Data":"fd10d4f85e04ded895f7718dd53443f09a3be089bf6f4718e6d017852d997436"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.845643 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.860303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9" (OuterVolumeSpecName: "kube-api-access-h6pv9") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "kube-api-access-h6pv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.862936 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2" (OuterVolumeSpecName: "kube-api-access-hxzt2") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "kube-api-access-hxzt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.864362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.887858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts" (OuterVolumeSpecName: "scripts") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.892730 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6 is running failed: container process not found" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.906730 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6 is running failed: container process not found" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.907508 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc 
= container is not created or running: checking if PID of 4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6 is running failed: container process not found" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.907537 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.912660 4795 scope.go:117] "RemoveContainer" containerID="9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917403 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917428 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917438 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917446 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 
crc kubenswrapper[4795]: I0219 21:50:00.917455 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxzt2\" (UniqueName: \"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917465 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917484 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917493 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917502 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917510 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917519 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917527 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8fsc\" (UniqueName: 
\"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917747 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.925384 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "296f6b57-de45-495d-abe9-8c779c157057" (UID: "296f6b57-de45-495d-abe9-8c779c157057"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.932492 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.933666 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.941328 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "296f6b57-de45-495d-abe9-8c779c157057" (UID: "296f6b57-de45-495d-abe9-8c779c157057"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.965097 4795 scope.go:117] "RemoveContainer" containerID="16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.974044 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.982002 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.983621 4795 scope.go:117] "RemoveContainer" containerID="5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.988779 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.994636 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "296f6b57-de45-495d-abe9-8c779c157057" (UID: "296f6b57-de45-495d-abe9-8c779c157057"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.006256 4795 scope.go:117] "RemoveContainer" containerID="4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019333 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019605 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: 
\"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019869 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019927 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.020015 4795 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.020030 4795 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.020040 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.020051 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.020886 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.020984 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:03.020961447 +0000 UTC m=+1314.213479311 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : configmap "openstack-scripts" not found Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.023411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data" (OuterVolumeSpecName: "config-data") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.023822 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.023838 4795 projected.go:194] Error preparing data for projected volume kube-api-access-qcb7p for pod openstack/keystone-41fb-account-create-update-h4ql2: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.024733 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:03.02423617 +0000 UTC m=+1314.216754034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qcb7p" (UniqueName: "kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.029409 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.034948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.036699 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v" (OuterVolumeSpecName: "kube-api-access-csp2v") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "kube-api-access-csp2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.043179 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.049864 4795 scope.go:117] "RemoveContainer" containerID="5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.050326 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70\": container with ID starting with 5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70 not found: ID does not exist" containerID="5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.050360 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70"} err="failed to get container status \"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70\": rpc error: code = NotFound desc = could not find container \"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70\": container with ID starting with 5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.050379 4795 
scope.go:117] "RemoveContainer" containerID="4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.050737 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681\": container with ID starting with 4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681 not found: ID does not exist" containerID="4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.050777 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681"} err="failed to get container status \"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681\": rpc error: code = NotFound desc = could not find container \"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681\": container with ID starting with 4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.050807 4795 scope.go:117] "RemoveContainer" containerID="708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.069316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.077004 4795 scope.go:117] "RemoveContainer" containerID="46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.090247 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.095049 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.098425 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data" (OuterVolumeSpecName: "config-data") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.100305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data" (OuterVolumeSpecName: "config-data") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.120405 4795 scope.go:117] "RemoveContainer" containerID="708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122724 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122742 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122751 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122760 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122772 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122782 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122790 4795 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122799 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122810 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.126261 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217\": container with ID starting with 708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217 not found: ID does not exist" containerID="708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.126290 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217"} err="failed to get container status \"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217\": rpc error: code = NotFound desc = could not find container \"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217\": container with ID starting with 708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.126319 4795 scope.go:117] "RemoveContainer" containerID="46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.127471 4795 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f\": container with ID starting with 46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f not found: ID does not exist" containerID="46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.127517 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f"} err="failed to get container status \"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f\": rpc error: code = NotFound desc = could not find container \"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f\": container with ID starting with 46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.127542 4795 scope.go:117] "RemoveContainer" containerID="5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.147385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.147796 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.171483 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.174785 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.175154 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.180593 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.186334 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.192595 4795 scope.go:117] "RemoveContainer" containerID="11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.207413 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.208929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.217920 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data" (OuterVolumeSpecName: "config-data") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.218318 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data" (OuterVolumeSpecName: "config-data") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.219053 4795 scope.go:117] "RemoveContainer" containerID="a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.220545 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ktl2b" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223789 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223812 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223821 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223830 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223838 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223846 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223854 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223862 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.257348 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.279790 4795 scope.go:117] "RemoveContainer" containerID="11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.280418 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.280438 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5\": container with ID starting with 11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5 not found: ID does not exist" containerID="11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.280490 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5"} err="failed to get container status \"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5\": rpc error: code = NotFound desc = could not find container \"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5\": container with ID starting with 11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.280521 4795 scope.go:117] "RemoveContainer" containerID="a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.281195 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37\": container with ID starting with a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37 not found: ID does not exist" containerID="a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.281235 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37"} err="failed 
to get container status \"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37\": rpc error: code = NotFound desc = could not find container \"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37\": container with ID starting with a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.281263 4795 scope.go:117] "RemoveContainer" containerID="067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.300946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.303185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data" (OuterVolumeSpecName: "config-data") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.317069 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.323413 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") pod \"2164f9d1-1d8b-486b-beca-0d3a5172b302\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331299 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") pod \"2164f9d1-1d8b-486b-beca-0d3a5172b302\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") pod \"57b83043-2f7c-4b55-a2b9-66eef96f0008\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") pod \"57b83043-2f7c-4b55-a2b9-66eef96f0008\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331430 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") pod \"57b83043-2f7c-4b55-a2b9-66eef96f0008\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331661 4795 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331676 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331689 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331697 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331705 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331713 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.334701 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2164f9d1-1d8b-486b-beca-0d3a5172b302" (UID: "2164f9d1-1d8b-486b-beca-0d3a5172b302"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.335867 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq" (OuterVolumeSpecName: "kube-api-access-v44bq") pod "57b83043-2f7c-4b55-a2b9-66eef96f0008" (UID: "57b83043-2f7c-4b55-a2b9-66eef96f0008"). InnerVolumeSpecName "kube-api-access-v44bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.338728 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6" (OuterVolumeSpecName: "kube-api-access-9kll6") pod "2164f9d1-1d8b-486b-beca-0d3a5172b302" (UID: "2164f9d1-1d8b-486b-beca-0d3a5172b302"). InnerVolumeSpecName "kube-api-access-9kll6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.364279 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data" (OuterVolumeSpecName: "config-data") pod "57b83043-2f7c-4b55-a2b9-66eef96f0008" (UID: "57b83043-2f7c-4b55-a2b9-66eef96f0008"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.367370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57b83043-2f7c-4b55-a2b9-66eef96f0008" (UID: "57b83043-2f7c-4b55-a2b9-66eef96f0008"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433248 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433678 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433690 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433702 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433711 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.523909 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="296f6b57-de45-495d-abe9-8c779c157057" path="/var/lib/kubelet/pods/296f6b57-de45-495d-abe9-8c779c157057/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.524527 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="299d8d1c-c181-4c7b-b95f-9f3c62ddb102" path="/var/lib/kubelet/pods/299d8d1c-c181-4c7b-b95f-9f3c62ddb102/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.525768 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" path="/var/lib/kubelet/pods/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.527048 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" path="/var/lib/kubelet/pods/3697a3b0-4077-4837-bcdc-c17d8aa361f1/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.527718 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" path="/var/lib/kubelet/pods/4532c069-4eb7-48ab-b575-b6a130e2b438/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.530891 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eaa69df-d563-4dc0-8a78-40413946cbca" path="/var/lib/kubelet/pods/8eaa69df-d563-4dc0-8a78-40413946cbca/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.531404 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" path="/var/lib/kubelet/pods/b01bcd5b-435a-4702-b0a4-8dfe8f553c23/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.537610 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" path="/var/lib/kubelet/pods/da2e3f89-bf0b-4371-8e5b-a0037f266c70/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.538554 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ee3fde95-91bf-4f6a-9753-f879d56fedbb" path="/var/lib/kubelet/pods/ee3fde95-91bf-4f6a-9753-f879d56fedbb/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.539135 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" path="/var/lib/kubelet/pods/f6fd7841-2a08-4786-8e96-b2ab0f477eff/volumes" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.570312 4795 scope.go:117] "RemoveContainer" containerID="46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.614508 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe2e673c-4ce5-4df9-b9e6-0b7cea99727c/ovn-northd/0.log" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.614585 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.626616 4795 scope.go:117] "RemoveContainer" containerID="a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.637972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.638216 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.638334 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.638494 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.638521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.639838 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts" (OuterVolumeSpecName: "scripts") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.640281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config" (OuterVolumeSpecName: "config") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.640311 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.641654 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.646934 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt" (OuterVolumeSpecName: "kube-api-access-xxwxt") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "kube-api-access-xxwxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.741947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.742061 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.742663 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.742689 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.742703 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.751634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.766363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.796852 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.801263 4795 scope.go:117] "RemoveContainer" containerID="d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.819708 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.825799 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.841468 4795 scope.go:117] "RemoveContainer" containerID="a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.842870 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.842928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") pod 
\"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.842946 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843090 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843121 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843139 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtmqc\" (UniqueName: \"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843232 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843471 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843482 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.848772 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.850384 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429\": container with ID starting with a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429 not found: ID does not exist" containerID="a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.850566 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429"} err="failed to get container status \"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429\": rpc error: code = NotFound desc = could not find container \"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429\": container with ID starting with a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.850590 4795 scope.go:117] "RemoveContainer" containerID="d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.850744 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.852560 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.858794 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632\": container with ID starting with d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632 not found: ID does not exist" containerID="d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.859500 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632"} err="failed to get container status \"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632\": rpc error: code = NotFound desc = could not find container \"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632\": container with ID starting with d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.859537 4795 scope.go:117] "RemoveContainer" containerID="812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.861504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc" (OuterVolumeSpecName: 
"kube-api-access-gtmqc") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "kube-api-access-gtmqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.862349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.869445 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875058 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe2e673c-4ce5-4df9-b9e6-0b7cea99727c/ovn-northd/0.log" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875160 4795 generic.go:334] "Generic (PLEG): container finished" podID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" exitCode=139 Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875353 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerDied","Data":"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerDied","Data":"dbab531f1a8f22d58c44dcac6c6209fda329451de2d8664028adcfc876aa2507"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875437 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.877299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.884717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"57b83043-2f7c-4b55-a2b9-66eef96f0008","Type":"ContainerDied","Data":"83f719d65e236fae031c225d4f8065a2b4c198be5a5993edbe70af70bfebe600"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.884835 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.885657 4795 scope.go:117] "RemoveContainer" containerID="38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.885787 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.891093 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.900512 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ktl2b" event={"ID":"2164f9d1-1d8b-486b-beca-0d3a5172b302","Type":"ContainerDied","Data":"0bc8d13f4092138cc363d9e77ad1f35f49f21dad6c940b0ffcd7de9f24d779fb"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.900586 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ktl2b" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.913222 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.919531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3","Type":"ContainerDied","Data":"aada954b6c8106a5c25613b1c4b96d76ce41049aa7128aa357d9511f84c5abf0"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.919613 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.926107 4795 generic.go:334] "Generic (PLEG): container finished" podID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerID="0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" exitCode=0 Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.926190 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.926629 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.931328 4795 scope.go:117] "RemoveContainer" containerID="1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.931353 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerDied","Data":"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.931394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerDied","Data":"050b9a153d584bbd1ba63be9e7a93c951075127827418493ce2ba5e1d8a7ed20"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.931407 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.935175 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.950955 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.950986 4795 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951339 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtmqc\" (UniqueName: \"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc 
kubenswrapper[4795]: I0219 21:50:01.951354 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951365 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951397 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951407 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951417 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951426 4795 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.958916 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.972743 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on 
node "crc" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.973419 4795 scope.go:117] "RemoveContainer" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.975844 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.987333 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.995300 4795 scope.go:117] "RemoveContainer" containerID="1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.996368 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3\": container with ID starting with 1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3 not found: ID does not exist" containerID="1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.996400 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3"} err="failed to get container status \"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3\": rpc error: code = NotFound desc = could not find container \"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3\": container with ID starting with 1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.996421 4795 scope.go:117] "RemoveContainer" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.996769 
4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f\": container with ID starting with 2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f not found: ID does not exist" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.996796 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f"} err="failed to get container status \"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f\": rpc error: code = NotFound desc = could not find container \"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f\": container with ID starting with 2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.996821 4795 scope.go:117] "RemoveContainer" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.997801 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.004356 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.010355 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.013203 4795 scope.go:117] "RemoveContainer" containerID="37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.013905 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:50:02 crc 
kubenswrapper[4795]: I0219 21:50:02.019570 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.025359 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.030927 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.033983 4795 scope.go:117] "RemoveContainer" containerID="db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.035391 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.063085 4795 scope.go:117] "RemoveContainer" containerID="0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.066100 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.068600 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.078059 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.080322 4795 scope.go:117] "RemoveContainer" containerID="106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.082755 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.088601 4795 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.101775 4795 scope.go:117] "RemoveContainer" containerID="0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.105744 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153\": container with ID starting with 0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153 not found: ID does not exist" containerID="0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.105776 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153"} err="failed to get container status \"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153\": rpc error: code = NotFound desc = could not find container \"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153\": container with ID starting with 0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153 not found: ID does not exist" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.105798 4795 scope.go:117] "RemoveContainer" containerID="106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0" Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.106211 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0\": container with ID starting with 106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0 not found: ID does not exist" containerID="106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 
21:50:02.106265 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0"} err="failed to get container status \"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0\": rpc error: code = NotFound desc = could not find container \"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0\": container with ID starting with 106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0 not found: ID does not exist" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.171371 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.209100 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.216417 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.273087 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.273140 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.273226 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:50:10.273207406 +0000 UTC m=+1321.465725270 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.628063 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.628494 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.629022 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.629047 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process 
not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.630723 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.633252 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.634993 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.635027 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.800469 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd" probeResult="failure" output="Get 
\"https://10.217.0.205:3000/\": dial tcp 10.217.0.205:3000: connect: connection refused" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.829891 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" probeResult="failure" output="command timed out" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.864058 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" probeResult="failure" output=< Feb 19 21:50:02 crc kubenswrapper[4795]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Feb 19 21:50:02 crc kubenswrapper[4795]: > Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.906022 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.200:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.949606 4795 generic.go:334] "Generic (PLEG): container finished" podID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerID="5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e" exitCode=0 Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.949682 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerDied","Data":"5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e"} Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.958060 4795 generic.go:334] "Generic (PLEG): container finished" podID="4b928260-ac65-479d-bd4b-f14b48d24ddb" 
containerID="77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba" exitCode=0 Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.958152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6945f64f65-rnq2b" event={"ID":"4b928260-ac65-479d-bd4b-f14b48d24ddb","Type":"ContainerDied","Data":"77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba"} Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.958222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6945f64f65-rnq2b" event={"ID":"4b928260-ac65-479d-bd4b-f14b48d24ddb","Type":"ContainerDied","Data":"bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077"} Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.958233 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.967562 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.987908 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.987956 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988031 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988062 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988130 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988158 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.996095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.000606 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.000878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5" (OuterVolumeSpecName: "kube-api-access-v8hb5") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "kube-api-access-v8hb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.008507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts" (OuterVolumeSpecName: "scripts") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.014794 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.015072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data" (OuterVolumeSpecName: "config-data") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.018074 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.020940 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.021176 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.021214 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.024112 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.035139 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.041422 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088854 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pjq4\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" 
(UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088950 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089406 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089430 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089473 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089675 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089686 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089695 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089703 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089711 4795 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090204 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090503 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090517 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090525 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090745 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.091704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4" (OuterVolumeSpecName: "kube-api-access-7pjq4") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "kube-api-access-7pjq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.091727 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.092389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info" (OuterVolumeSpecName: "pod-info") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.093086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.093783 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.114158 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data" (OuterVolumeSpecName: "config-data") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.123134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf" (OuterVolumeSpecName: "server-conf") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.164178 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.191530 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pjq4\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.191564 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.191575 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.191570 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.191664 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:11.191642569 +0000 UTC m=+1322.384160453 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.191584 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192007 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192024 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192052 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192065 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192078 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192091 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192104 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.211754 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.293339 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.522297 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" path="/var/lib/kubelet/pods/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.523651 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" path="/var/lib/kubelet/pods/2164f9d1-1d8b-486b-beca-0d3a5172b302/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.524450 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9" path="/var/lib/kubelet/pods/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.525454 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" path="/var/lib/kubelet/pods/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.527327 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" path="/var/lib/kubelet/pods/57b83043-2f7c-4b55-a2b9-66eef96f0008/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.528212 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" path="/var/lib/kubelet/pods/793bbadc-8b53-4084-a63a-0b76b37284df/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.530950 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" path="/var/lib/kubelet/pods/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.532938 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" path="/var/lib/kubelet/pods/d2561f4e-0a01-4927-96f8-ee7bef69f561/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.534821 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" path="/var/lib/kubelet/pods/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.535989 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a6a069-904a-4072-b98c-346f67f22def" path="/var/lib/kubelet/pods/e4a6a069-904a-4072-b98c-346f67f22def/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.537569 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" path="/var/lib/kubelet/pods/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.574993 4795 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 19 21:50:03 crc kubenswrapper[4795]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-19T21:49:57Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 19 21:50:03 crc 
kubenswrapper[4795]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Feb 19 21:50:03 crc kubenswrapper[4795]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-w9fbs" message=< Feb 19 21:50:03 crc kubenswrapper[4795]: Exiting ovn-controller (1) [FAILED] Feb 19 21:50:03 crc kubenswrapper[4795]: Killing ovn-controller (1) [ OK ] Feb 19 21:50:03 crc kubenswrapper[4795]: 2026-02-19T21:49:57Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 19 21:50:03 crc kubenswrapper[4795]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Feb 19 21:50:03 crc kubenswrapper[4795]: > Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.575065 4795 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 19 21:50:03 crc kubenswrapper[4795]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-19T21:49:57Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 19 21:50:03 crc kubenswrapper[4795]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Feb 19 21:50:03 crc kubenswrapper[4795]: > pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" containerID="cri-o://e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.575156 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" containerID="cri-o://e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27" gracePeriod=23 Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.688242 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699930 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699952 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.700017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.700045 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.700097 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.701872 
4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.702239 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.702625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.740890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.741853 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info" (OuterVolumeSpecName: "pod-info") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.750376 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.757669 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.758957 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp" (OuterVolumeSpecName: "kube-api-access-7cwvp") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "kube-api-access-7cwvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.769148 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data" (OuterVolumeSpecName: "config-data") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.773610 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf" (OuterVolumeSpecName: "server-conf") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801234 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801264 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801278 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801290 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") on node \"crc\" DevicePath 
\"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801301 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801311 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801321 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801331 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801340 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801350 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.817924 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.819146 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.902391 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.902422 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.975610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerDied","Data":"12b91da897daae78f76b09af510ceca04ac8909ff3967813b7e6274bf414c6a5"} Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.975641 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.975659 4795 scope.go:117] "RemoveContainer" containerID="5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.979327 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerID="65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" exitCode=0 Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.979392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerDied","Data":"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a"} Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.979417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerDied","Data":"43ff46f740a6f7a342639c9893e1a10e76310ef799a0ad928eb028dabd7dd840"} Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.979411 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.983549 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-w9fbs_c30e8522-2d7f-4f10-a0b4-a7cfc351d093/ovn-controller/0.log" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.983584 4795 generic.go:334] "Generic (PLEG): container finished" podID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerID="e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27" exitCode=139 Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.983641 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.983633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs" event={"ID":"c30e8522-2d7f-4f10-a0b4-a7cfc351d093","Type":"ContainerDied","Data":"e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27"} Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.015396 4795 scope.go:117] "RemoveContainer" containerID="f5dc53fcd687359370a9224413921410e03027d27bb1e741143948af4422db6c" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.018473 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.025723 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.053274 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.061945 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.075221 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.078310 4795 scope.go:117] "RemoveContainer" containerID="65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.080539 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.105210 4795 scope.go:117] "RemoveContainer" containerID="90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.127859 4795 scope.go:117] "RemoveContainer" 
containerID="65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" Feb 19 21:50:04 crc kubenswrapper[4795]: E0219 21:50:04.128527 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a\": container with ID starting with 65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a not found: ID does not exist" containerID="65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.128579 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a"} err="failed to get container status \"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a\": rpc error: code = NotFound desc = could not find container \"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a\": container with ID starting with 65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a not found: ID does not exist" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.128607 4795 scope.go:117] "RemoveContainer" containerID="90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55" Feb 19 21:50:04 crc kubenswrapper[4795]: E0219 21:50:04.128873 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55\": container with ID starting with 90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55 not found: ID does not exist" containerID="90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.128917 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55"} err="failed to get container status \"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55\": rpc error: code = NotFound desc = could not find container \"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55\": container with ID starting with 90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55 not found: ID does not exist" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.600664 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-w9fbs_c30e8522-2d7f-4f10-a0b4-a7cfc351d093/ovn-controller/0.log" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.600744 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.613953 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.613995 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614034 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614076 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614094 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614143 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.615153 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts" (OuterVolumeSpecName: "scripts") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.615230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run" (OuterVolumeSpecName: "var-run") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.615252 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.620740 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj" (OuterVolumeSpecName: "kube-api-access-bkzbj") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "kube-api-access-bkzbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.644423 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.698908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.705889 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715550 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715600 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715610 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715622 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715630 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715638 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715646 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816274 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816409 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816441 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") pod 
\"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816546 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816609 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.817895 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.818045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.820678 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts" (OuterVolumeSpecName: "scripts") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.823613 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7" (OuterVolumeSpecName: "kube-api-access-829s7") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "kube-api-access-829s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.840799 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.860085 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.889297 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.896483 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data" (OuterVolumeSpecName: "config-data") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.907544 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.917605 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") pod \"f2710b23-7a5c-44cb-b916-9e08edc59636\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.917712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") pod \"f2710b23-7a5c-44cb-b916-9e08edc59636\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.917789 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") pod \"f2710b23-7a5c-44cb-b916-9e08edc59636\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.917999 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918013 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918025 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918035 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918044 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918054 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918063 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918076 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.948803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2710b23-7a5c-44cb-b916-9e08edc59636" (UID: "f2710b23-7a5c-44cb-b916-9e08edc59636"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.949409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7" (OuterVolumeSpecName: "kube-api-access-t5wq7") pod "f2710b23-7a5c-44cb-b916-9e08edc59636" (UID: "f2710b23-7a5c-44cb-b916-9e08edc59636"). InnerVolumeSpecName "kube-api-access-t5wq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.958929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data" (OuterVolumeSpecName: "config-data") pod "f2710b23-7a5c-44cb-b916-9e08edc59636" (UID: "f2710b23-7a5c-44cb-b916-9e08edc59636"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992370 4795 generic.go:334] "Generic (PLEG): container finished" podID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" exitCode=0 Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992457 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2710b23-7a5c-44cb-b916-9e08edc59636","Type":"ContainerDied","Data":"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6"} Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992502 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2710b23-7a5c-44cb-b916-9e08edc59636","Type":"ContainerDied","Data":"18a795e7f80bb780eadeb9ae01b9659d15da8639c51f358e1baf726a07014084"} Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992517 4795 scope.go:117] "RemoveContainer" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.997661 4795 generic.go:334] "Generic (PLEG): container finished" podID="e956453d-551f-44b4-8125-8656b3155402" containerID="98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" exitCode=0 Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.997699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734"} Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.997720 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.997731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"9d09fb4d826d8602127fabff658a8440e51f38b0c8a942f510e29c6808527ef7"} Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.001509 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-w9fbs_c30e8522-2d7f-4f10-a0b4-a7cfc351d093/ovn-controller/0.log" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.001621 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.001731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs" event={"ID":"c30e8522-2d7f-4f10-a0b4-a7cfc351d093","Type":"ContainerDied","Data":"dc18420d588bd541d274269ae096f1224bb6a914c81107d9d0d3602a4e7a25d2"} Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.019291 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.019319 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.019329 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.028138 4795 scope.go:117] "RemoveContainer" 
containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.028535 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6\": container with ID starting with f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6 not found: ID does not exist" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.028567 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6"} err="failed to get container status \"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6\": rpc error: code = NotFound desc = could not find container \"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6\": container with ID starting with f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6 not found: ID does not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.028585 4795 scope.go:117] "RemoveContainer" containerID="c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.038445 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.052446 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.059722 4795 scope.go:117] "RemoveContainer" containerID="21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.060836 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 
21:50:05.066360 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.071081 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.075347 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.076270 4795 scope.go:117] "RemoveContainer" containerID="98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.093715 4795 scope.go:117] "RemoveContainer" containerID="a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.113948 4795 scope.go:117] "RemoveContainer" containerID="c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.114547 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26\": container with ID starting with c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26 not found: ID does not exist" containerID="c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.114575 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26"} err="failed to get container status \"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26\": rpc error: code = NotFound desc = could not find container \"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26\": container with ID starting with c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26 not found: ID does 
not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.114599 4795 scope.go:117] "RemoveContainer" containerID="21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.115446 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb\": container with ID starting with 21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb not found: ID does not exist" containerID="21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.115477 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb"} err="failed to get container status \"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb\": rpc error: code = NotFound desc = could not find container \"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb\": container with ID starting with 21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb not found: ID does not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.115497 4795 scope.go:117] "RemoveContainer" containerID="98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.115720 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734\": container with ID starting with 98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734 not found: ID does not exist" containerID="98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.115741 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734"} err="failed to get container status \"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734\": rpc error: code = NotFound desc = could not find container \"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734\": container with ID starting with 98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734 not found: ID does not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.115754 4795 scope.go:117] "RemoveContainer" containerID="a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.116133 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552\": container with ID starting with a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552 not found: ID does not exist" containerID="a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.116241 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552"} err="failed to get container status \"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552\": rpc error: code = NotFound desc = could not find container \"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552\": container with ID starting with a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552 not found: ID does not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.116311 4795 scope.go:117] "RemoveContainer" containerID="e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.520748 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" path="/var/lib/kubelet/pods/4b928260-ac65-479d-bd4b-f14b48d24ddb/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.521674 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" path="/var/lib/kubelet/pods/7b096325-542d-4ac6-8d16-8aa0937013b2/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.522654 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" path="/var/lib/kubelet/pods/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.524026 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" path="/var/lib/kubelet/pods/c30e8522-2d7f-4f10-a0b4-a7cfc351d093/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.524760 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e956453d-551f-44b4-8125-8656b3155402" path="/var/lib/kubelet/pods/e956453d-551f-44b4-8125-8656b3155402/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.526235 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" path="/var/lib/kubelet/pods/f2710b23-7a5c-44cb-b916-9e08edc59636/volumes" Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.628550 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.629532 4795 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.629831 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.629870 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.630419 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.632659 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.636857 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.636956 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:10 crc kubenswrapper[4795]: I0219 21:50:10.024447 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-576c65f985-r97z7" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.162:9696/\": dial tcp 10.217.0.162:9696: connect: connection refused" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.829746 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.937752 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.937862 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.937998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.938056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.938138 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.938196 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.938244 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.943919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52" (OuterVolumeSpecName: "kube-api-access-dln52") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "kube-api-access-dln52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.944347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.976304 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.979707 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.986361 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.994053 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.011359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config" (OuterVolumeSpecName: "config") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040113 4795 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040386 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040442 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040506 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040559 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040608 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040656 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.090762 4795 
generic.go:334] "Generic (PLEG): container finished" podID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerID="b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" exitCode=0 Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.090831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerDied","Data":"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0"} Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.090899 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.091216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerDied","Data":"cc411b717439dc2d51f309775cfcf3728048016bc68869b8b28221a90840d6fb"} Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.091266 4795 scope.go:117] "RemoveContainer" containerID="fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.113983 4795 scope.go:117] "RemoveContainer" containerID="b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.129500 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.133975 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.152489 4795 scope.go:117] "RemoveContainer" containerID="fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.153144 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec\": container with ID starting with fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec not found: ID does not exist" containerID="fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.153204 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec"} err="failed to get container status \"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec\": rpc error: code = NotFound desc = could not find container \"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec\": container with ID starting with fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec not found: ID does not exist" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.153235 4795 scope.go:117] "RemoveContainer" containerID="b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.153884 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0\": container with ID starting with b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0 not found: ID does not exist" containerID="b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.153958 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0"} err="failed to get container status \"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0\": rpc error: code = NotFound desc = could not find container 
\"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0\": container with ID starting with b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0 not found: ID does not exist" Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.628667 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.632446 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.635716 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.637732 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.637786 4795 prober.go:104] "Probe errored" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.632979 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.642095 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.642389 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:13 crc kubenswrapper[4795]: I0219 21:50:13.528957 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" path="/var/lib/kubelet/pods/30f2c894-7a7a-4e5a-a090-a28ab50c766a/volumes" Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.630563 4795 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.632267 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.632334 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.638125 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.638225 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.639413 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.641115 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.641215 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.628017 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.629273 4795 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.630006 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.630725 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.630795 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.632997 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.635240 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.635309 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.249061 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98" exitCode=137 Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.249180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98"} Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.511434 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.668872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.668941 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.668968 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.669019 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.669037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bh4z\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.669062 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.670113 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache" (OuterVolumeSpecName: "cache") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.670480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock" (OuterVolumeSpecName: "lock") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.675750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.676133 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z" (OuterVolumeSpecName: "kube-api-access-7bh4z") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "kube-api-access-7bh4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.676437 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.719048 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770565 4795 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770624 4795 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770653 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770668 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770680 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bh4z\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: 
I0219 21:50:26.788911 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871305 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871517 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 
21:50:26.871704 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.872215 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.875498 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts" (OuterVolumeSpecName: "scripts") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.881521 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.881563 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.881574 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.885411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc" (OuterVolumeSpecName: "kube-api-access-b4bdc") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "kube-api-access-b4bdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.900354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.950243 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.977465 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data" (OuterVolumeSpecName: "config-data") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.983691 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.983713 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.983724 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.983735 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.998409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.078402 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tl5hf_9a19676c-9314-43a3-a2f8-bcf56d6b5ce3/ovs-vswitchd/0.log" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.079350 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.085615 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186200 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186276 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib" (OuterVolumeSpecName: "var-lib") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186429 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log" (OuterVolumeSpecName: "var-log") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run" (OuterVolumeSpecName: "var-run") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186581 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187186 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187214 4795 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187227 4795 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187239 4795 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187574 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts" (OuterVolumeSpecName: "scripts") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.190705 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572" (OuterVolumeSpecName: "kube-api-access-rp572") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "kube-api-access-rp572". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.266235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.266297 4795 scope.go:117] "RemoveContainer" containerID="955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.266345 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.268958 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tl5hf_9a19676c-9314-43a3-a2f8-bcf56d6b5ce3/ovs-vswitchd/0.log" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.269927 4795 generic.go:334] "Generic (PLEG): container finished" podID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" exitCode=137 Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.270029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerDied","Data":"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.270105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerDied","Data":"79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.270283 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.273863 4795 generic.go:334] "Generic (PLEG): container finished" podID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerID="5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8" exitCode=137 Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.274237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerDied","Data":"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.274269 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerDied","Data":"edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.274358 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.288191 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.288230 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.295062 4795 scope.go:117] "RemoveContainer" containerID="bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.312817 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.331029 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.333337 4795 scope.go:117] "RemoveContainer" containerID="8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.335679 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.343296 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.352810 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.359084 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.367388 4795 scope.go:117] "RemoveContainer" 
containerID="2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.388195 4795 scope.go:117] "RemoveContainer" containerID="cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.407897 4795 scope.go:117] "RemoveContainer" containerID="c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.425745 4795 scope.go:117] "RemoveContainer" containerID="c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.451069 4795 scope.go:117] "RemoveContainer" containerID="f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.472942 4795 scope.go:117] "RemoveContainer" containerID="5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.492884 4795 scope.go:117] "RemoveContainer" containerID="be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.513374 4795 scope.go:117] "RemoveContainer" containerID="a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.525154 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" path="/var/lib/kubelet/pods/6c773ec2-a400-42a9-8784-ed9c295c3bb4/volumes" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.529020 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" path="/var/lib/kubelet/pods/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3/volumes" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.530244 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" path="/var/lib/kubelet/pods/c54f77a4-1095-4ff1-bc74-b845cde659d9/volumes" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.533856 4795 scope.go:117] "RemoveContainer" containerID="d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.555485 4795 scope.go:117] "RemoveContainer" containerID="22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.574812 4795 scope.go:117] "RemoveContainer" containerID="b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.592496 4795 scope.go:117] "RemoveContainer" containerID="51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.611206 4795 scope.go:117] "RemoveContainer" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.637865 4795 scope.go:117] "RemoveContainer" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.666456 4795 scope.go:117] "RemoveContainer" containerID="000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.702721 4795 scope.go:117] "RemoveContainer" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.703475 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf\": container with ID starting with ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf not found: ID does not exist" 
containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.703543 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf"} err="failed to get container status \"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf\": rpc error: code = NotFound desc = could not find container \"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf\": container with ID starting with ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf not found: ID does not exist" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.703582 4795 scope.go:117] "RemoveContainer" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.704102 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23\": container with ID starting with e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 not found: ID does not exist" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.704234 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23"} err="failed to get container status \"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23\": rpc error: code = NotFound desc = could not find container \"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23\": container with ID starting with e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 not found: ID does not exist" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.704283 4795 scope.go:117] 
"RemoveContainer" containerID="000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde" Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.705534 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde\": container with ID starting with 000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde not found: ID does not exist" containerID="000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.705578 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde"} err="failed to get container status \"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde\": rpc error: code = NotFound desc = could not find container \"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde\": container with ID starting with 000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde not found: ID does not exist" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.705604 4795 scope.go:117] "RemoveContainer" containerID="b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.745717 4795 scope.go:117] "RemoveContainer" containerID="5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.767427 4795 scope.go:117] "RemoveContainer" containerID="b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1" Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.767956 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1\": container with ID starting with 
b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1 not found: ID does not exist" containerID="b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.768001 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1"} err="failed to get container status \"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1\": rpc error: code = NotFound desc = could not find container \"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1\": container with ID starting with b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1 not found: ID does not exist" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.768027 4795 scope.go:117] "RemoveContainer" containerID="5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8" Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.768440 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8\": container with ID starting with 5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8 not found: ID does not exist" containerID="5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.768470 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8"} err="failed to get container status \"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8\": rpc error: code = NotFound desc = could not find container \"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8\": container with ID starting with 5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8 not found: ID does not 
exist" Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.427719 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.427763 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.427799 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.428313 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.428357 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811" gracePeriod=600 Feb 19 21:50:29 crc kubenswrapper[4795]: I0219 21:50:29.303928 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" 
containerID="d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811" exitCode=0 Feb 19 21:50:29 crc kubenswrapper[4795]: I0219 21:50:29.304001 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811"} Feb 19 21:50:29 crc kubenswrapper[4795]: I0219 21:50:29.304359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8"} Feb 19 21:50:29 crc kubenswrapper[4795]: I0219 21:50:29.304381 4795 scope.go:117] "RemoveContainer" containerID="26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.118093 4795 scope.go:117] "RemoveContainer" containerID="088f63c71340a869532178f76fc11ab9c0cbbdc5c1aeacd6e851cf5d40a46fa0" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.175441 4795 scope.go:117] "RemoveContainer" containerID="6b0c6da228cd7d133c73866bc31005f08b881ce2ffda7a1dc8905fe2cbf580b7" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.198598 4795 scope.go:117] "RemoveContainer" containerID="9ae6b68ead4d625967dc9f8e67a83bcadf072cd6d6b5d617f89d10bb543678b8" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.252100 4795 scope.go:117] "RemoveContainer" containerID="83bc6dd6efe6f9962ae47d6fb7eb26b3205715ecf2651537919f42966aa1ce32" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.273188 4795 scope.go:117] "RemoveContainer" containerID="d90138519ffcbc0102f0a7e9dc5bfa3f8f09f718ea4e3fd3a5f7e537cfb53122" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.291538 4795 scope.go:117] "RemoveContainer" 
containerID="464384efe0cd85358add102341feabf48351464a71bfc2ad9ed2ce3ea45a4a94" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.323258 4795 scope.go:117] "RemoveContainer" containerID="c3d76989d78df6e0877ea687626ca27c833a6485113f8ba0b36de864c9998267" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.340866 4795 scope.go:117] "RemoveContainer" containerID="83550b0e412e3c0f2147b2389e1c3220efde13a4057f9e67e612e0461d3172fc" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.357932 4795 scope.go:117] "RemoveContainer" containerID="b0effc166b07beb1020b33c2901a97075e3341db6cb3398e72144f393ccb6850" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.399657 4795 scope.go:117] "RemoveContainer" containerID="122378ef931bbdce5fbd1c31b593170eeae5e5fed8a74b40fbb0d536de3a3d22" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.454698 4795 scope.go:117] "RemoveContainer" containerID="27f5e555e30e338ab9e5ae7facd6ba963cf6336bbb4bec245c099c4afa8bb6a3" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.484514 4795 scope.go:117] "RemoveContainer" containerID="f13d7ab12cdcba95909b27dcd1c1e77cc3443dbab3be6042ac8015c2168e9280" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.531376 4795 scope.go:117] "RemoveContainer" containerID="00ef13a6fc1812f0a18c09ccfd124e49c733a0a626dc7ae23746169010b14516" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.554165 4795 scope.go:117] "RemoveContainer" containerID="88bfd25384ba94c2658b1d015dafd73e32d146937b8caf9b1891261850b11f22" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.567848 4795 scope.go:117] "RemoveContainer" containerID="7c32d427d23010f8ec7644388fc5c6ced9ee00bbf0a6575354abad70dc15498d" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.593046 4795 scope.go:117] "RemoveContainer" containerID="2c21dc2092bc9760ef50273deb040b6251a319d32c5e2c0fdd3bf1678ba55094" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.613320 4795 scope.go:117] "RemoveContainer" 
containerID="62e8bfd862eff78d929edc90f98ab271d6cbd53192119320b7d77f0dddb03767" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.633978 4795 scope.go:117] "RemoveContainer" containerID="95c63f1640980c95f161952724dcddb5ac630545c512a2aa1ea3882e47e48df9" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.649399 4795 scope.go:117] "RemoveContainer" containerID="f3e360ac25dc639935c61b2baa9937c7f07b88ebec95aede182ab25a4907955c" Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.676066 4795 scope.go:117] "RemoveContainer" containerID="0c92f3d0df6fb4ffa1967f7b15d462ab0538b07666cad5a1fe2cc6d118293fe8" Feb 19 21:52:28 crc kubenswrapper[4795]: I0219 21:52:28.427394 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:52:28 crc kubenswrapper[4795]: I0219 21:52:28.427927 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:52:58 crc kubenswrapper[4795]: I0219 21:52:58.427459 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:52:58 crc kubenswrapper[4795]: I0219 21:52:58.428054 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:53:12 crc kubenswrapper[4795]: I0219 21:53:12.930212 4795 scope.go:117] "RemoveContainer" containerID="76c63f1533e7c3fa5624057c7772474f1729c45e568e970f6a98d3e337ab74ae" Feb 19 21:53:12 crc kubenswrapper[4795]: I0219 21:53:12.969142 4795 scope.go:117] "RemoveContainer" containerID="0852bee2926e9dad886249433c004e0f4241817093b65fb2480708d4c6c502c0" Feb 19 21:53:13 crc kubenswrapper[4795]: I0219 21:53:13.013027 4795 scope.go:117] "RemoveContainer" containerID="a55b1033c9d990cf35d1c92321ef2ffff86d5c602754349685623befa9e70257" Feb 19 21:53:13 crc kubenswrapper[4795]: I0219 21:53:13.045156 4795 scope.go:117] "RemoveContainer" containerID="da621ffa210adf5451b6eb9d78a9cbed2608c7b5e2b1a9ce00eba1c1e65c6ec4" Feb 19 21:53:13 crc kubenswrapper[4795]: I0219 21:53:13.093756 4795 scope.go:117] "RemoveContainer" containerID="77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba" Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.427445 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.428036 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.428098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:53:28 crc 
kubenswrapper[4795]: I0219 21:53:28.428821 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.428893 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" gracePeriod=600 Feb 19 21:53:28 crc kubenswrapper[4795]: E0219 21:53:28.557104 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.845233 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" exitCode=0 Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.845322 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8"} Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.845421 4795 scope.go:117] "RemoveContainer" 
containerID="d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811" Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.845954 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:53:28 crc kubenswrapper[4795]: E0219 21:53:28.846151 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:53:42 crc kubenswrapper[4795]: I0219 21:53:42.511259 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:53:42 crc kubenswrapper[4795]: E0219 21:53:42.511994 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.031510 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.031955 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.031983 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-replicator" Feb 
19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032009 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032021 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032042 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032055 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032076 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-expirer" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032102 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-expirer" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032125 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="swift-recon-cron" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="swift-recon-cron" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032157 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032192 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-httpd" Feb 19 21:53:43 crc 
kubenswrapper[4795]: E0219 21:53:43.032222 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="setup-container" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032233 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="setup-container" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032254 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032277 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-api" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032297 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="openstack-network-exporter" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032311 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="openstack-network-exporter" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032326 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032338 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032364 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032377 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq" Feb 19 21:53:43 crc 
kubenswrapper[4795]: E0219 21:53:43.032395 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerName="memcached" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032407 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerName="memcached" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032422 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032434 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032456 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032468 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032482 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032493 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032514 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032527 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: 
E0219 21:53:43.032550 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032562 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-server" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032584 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032595 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032613 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-reaper" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032625 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-reaper" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032638 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="rsync" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032650 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="rsync" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032665 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032678 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-api" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032698 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="mysql-bootstrap" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032710 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="mysql-bootstrap" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032730 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="galera" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032742 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="galera" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032761 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032773 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032788 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="setup-container" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032800 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="setup-container" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032813 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032825 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032843 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="296f6b57-de45-495d-abe9-8c779c157057" containerName="kube-state-metrics" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032857 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="296f6b57-de45-495d-abe9-8c779c157057" containerName="kube-state-metrics" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032874 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032887 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032907 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032919 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032935 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032947 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032960 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032971 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-server" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032991 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033003 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-log" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033026 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033037 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-server" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033059 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" containerName="keystone-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033070 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" containerName="keystone-api" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033092 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033104 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033128 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033140 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-server" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033153 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033195 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-log" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033212 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033224 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033244 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-notification-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033256 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-notification-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033279 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033293 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-api" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033316 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-updater" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033328 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-updater" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033349 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033361 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033378 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033390 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033411 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="probe" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033423 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="probe" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033442 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033454 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033494 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033509 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033536 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-updater" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033549 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-updater" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033569 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="sg-core" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033581 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="sg-core" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033598 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033609 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-log" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033626 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033638 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033653 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="cinder-scheduler" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033664 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="cinder-scheduler" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033685 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server-init" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033697 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server-init" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033717 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033729 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033744 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-central-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033756 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-central-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033772 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033784 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033808 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033822 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033838 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033849 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033865 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033877 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034115 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034134 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034151 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerName="memcached" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034201 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034224 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034249 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-server" Feb 19 21:53:43 crc 
kubenswrapper[4795]: I0219 21:53:43.034272 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034322 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034343 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034363 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034384 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034398 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034411 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-expirer" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034432 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" containerName="keystone-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034451 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" 
containerName="nova-api-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034467 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034480 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034496 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-updater" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034514 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034531 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="openstack-network-exporter" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034543 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-reaper" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034565 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="galera" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034577 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034591 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034607 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034619 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034634 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034651 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-notification-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034664 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034678 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-central-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034700 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034718 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-updater" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034733 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034748 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034767 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="rsync" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034786 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034801 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034822 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="sg-core" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034840 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034852 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034870 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034885 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="probe" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034903 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034922 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034935 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034950 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034985 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="296f6b57-de45-495d-abe9-8c779c157057" containerName="kube-state-metrics" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035004 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="cinder-scheduler" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035022 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035035 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="swift-recon-cron" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035050 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035625 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.036823 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.064110 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.153588 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.153720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.153750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.255080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.255574 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.255715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.255845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.256500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.288494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.372353 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.871907 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.966758 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerStarted","Data":"29c905448e4c99f2240081e71d0b997093c5980cb0276302ef057ea183c10166"} Feb 19 21:53:44 crc kubenswrapper[4795]: I0219 21:53:44.975769 4795 generic.go:334] "Generic (PLEG): container finished" podID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerID="1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb" exitCode=0 Feb 19 21:53:44 crc kubenswrapper[4795]: I0219 21:53:44.975855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerDied","Data":"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb"} Feb 19 21:53:44 crc kubenswrapper[4795]: I0219 21:53:44.978136 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:53:45 crc kubenswrapper[4795]: I0219 21:53:45.984362 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerStarted","Data":"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04"} Feb 19 21:53:46 crc kubenswrapper[4795]: I0219 21:53:46.997265 4795 generic.go:334] "Generic (PLEG): container finished" podID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerID="8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04" exitCode=0 Feb 19 21:53:46 crc kubenswrapper[4795]: I0219 21:53:46.997313 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerDied","Data":"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04"} Feb 19 21:53:46 crc kubenswrapper[4795]: I0219 21:53:46.997345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerStarted","Data":"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841"} Feb 19 21:53:47 crc kubenswrapper[4795]: I0219 21:53:47.036627 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-crhfj" podStartSLOduration=2.606228411 podStartE2EDuration="4.036604796s" podCreationTimestamp="2026-02-19 21:53:43 +0000 UTC" firstStartedPulling="2026-02-19 21:53:44.977780752 +0000 UTC m=+1536.170298646" lastFinishedPulling="2026-02-19 21:53:46.408157137 +0000 UTC m=+1537.600675031" observedRunningTime="2026-02-19 21:53:47.03105751 +0000 UTC m=+1538.223575414" watchObservedRunningTime="2026-02-19 21:53:47.036604796 +0000 UTC m=+1538.229122670" Feb 19 21:53:53 crc kubenswrapper[4795]: I0219 21:53:53.373575 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:53 crc kubenswrapper[4795]: I0219 21:53:53.374325 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:53 crc kubenswrapper[4795]: I0219 21:53:53.426277 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:54 crc kubenswrapper[4795]: I0219 21:53:54.113616 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:54 crc kubenswrapper[4795]: I0219 21:53:54.182001 
4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.083076 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-crhfj" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="registry-server" containerID="cri-o://0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" gracePeriod=2 Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.560930 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.661947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") pod \"39607a76-451a-4cd6-806b-c14c6a94b5ae\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.662004 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") pod \"39607a76-451a-4cd6-806b-c14c6a94b5ae\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.662023 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") pod \"39607a76-451a-4cd6-806b-c14c6a94b5ae\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.663225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities" (OuterVolumeSpecName: "utilities") pod 
"39607a76-451a-4cd6-806b-c14c6a94b5ae" (UID: "39607a76-451a-4cd6-806b-c14c6a94b5ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.678057 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd" (OuterVolumeSpecName: "kube-api-access-v68jd") pod "39607a76-451a-4cd6-806b-c14c6a94b5ae" (UID: "39607a76-451a-4cd6-806b-c14c6a94b5ae"). InnerVolumeSpecName "kube-api-access-v68jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.717287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39607a76-451a-4cd6-806b-c14c6a94b5ae" (UID: "39607a76-451a-4cd6-806b-c14c6a94b5ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.763741 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.763770 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.763780 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.093535 4795 generic.go:334] "Generic (PLEG): container finished" podID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerID="0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" exitCode=0 Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.093584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerDied","Data":"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841"} Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.093661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerDied","Data":"29c905448e4c99f2240081e71d0b997093c5980cb0276302ef057ea183c10166"} Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.093692 4795 scope.go:117] "RemoveContainer" containerID="0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 
21:53:57.093612 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.126712 4795 scope.go:117] "RemoveContainer" containerID="8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.138658 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.145831 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.151859 4795 scope.go:117] "RemoveContainer" containerID="1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.176308 4795 scope.go:117] "RemoveContainer" containerID="0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" Feb 19 21:53:57 crc kubenswrapper[4795]: E0219 21:53:57.176922 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841\": container with ID starting with 0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841 not found: ID does not exist" containerID="0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.176972 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841"} err="failed to get container status \"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841\": rpc error: code = NotFound desc = could not find container \"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841\": container with ID starting with 
0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841 not found: ID does not exist" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.177002 4795 scope.go:117] "RemoveContainer" containerID="8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04" Feb 19 21:53:57 crc kubenswrapper[4795]: E0219 21:53:57.177302 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04\": container with ID starting with 8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04 not found: ID does not exist" containerID="8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.177335 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04"} err="failed to get container status \"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04\": rpc error: code = NotFound desc = could not find container \"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04\": container with ID starting with 8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04 not found: ID does not exist" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.177353 4795 scope.go:117] "RemoveContainer" containerID="1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb" Feb 19 21:53:57 crc kubenswrapper[4795]: E0219 21:53:57.177631 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb\": container with ID starting with 1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb not found: ID does not exist" containerID="1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb" Feb 19 21:53:57 crc 
kubenswrapper[4795]: I0219 21:53:57.177660 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb"} err="failed to get container status \"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb\": rpc error: code = NotFound desc = could not find container \"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb\": container with ID starting with 1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb not found: ID does not exist" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.512565 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:53:57 crc kubenswrapper[4795]: E0219 21:53:57.512946 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.522863 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" path="/var/lib/kubelet/pods/39607a76-451a-4cd6-806b-c14c6a94b5ae/volumes" Feb 19 21:54:09 crc kubenswrapper[4795]: I0219 21:54:09.515675 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:54:09 crc kubenswrapper[4795]: E0219 21:54:09.516483 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.207932 4795 scope.go:117] "RemoveContainer" containerID="820a0240f9371635d4e5f03ad2ffcfa48ce070182eb86809479efd07b7626507" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.278278 4795 scope.go:117] "RemoveContainer" containerID="e324028f2d7a41155077f14e0d48b2d58c21ebdbf00e4a7e4bd8a8141187b6bd" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.303769 4795 scope.go:117] "RemoveContainer" containerID="c82c86becfd7208e347af805b27fa40c1a5698022b6509cd540c7832c74ea578" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.339956 4795 scope.go:117] "RemoveContainer" containerID="5c8a8280c91cc71f3e85d0b7e361ab76b49d5efc392b87ec1bb737d4336102fe" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.366787 4795 scope.go:117] "RemoveContainer" containerID="54158f46698b83516d68eee860a71369629e83ec491495039fcc2f5f6a5bd317" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.434872 4795 scope.go:117] "RemoveContainer" containerID="df968256162833d5078e440bc65555f6f0195c60776c40f5962f3cbd9e8c0552" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.464581 4795 scope.go:117] "RemoveContainer" containerID="a0f7c3f34c80b9a269fbc6418536c8bd187965bff2a033810a317f2452cc049c" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.496845 4795 scope.go:117] "RemoveContainer" containerID="a5072d3cb490beed9fbcf82cc94732aed7e308348a7a93463173543a97f32666" Feb 19 21:54:23 crc kubenswrapper[4795]: I0219 21:54:23.512628 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:54:23 crc kubenswrapper[4795]: E0219 21:54:23.513707 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:54:36 crc kubenswrapper[4795]: I0219 21:54:36.511505 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:54:36 crc kubenswrapper[4795]: E0219 21:54:36.512332 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:54:49 crc kubenswrapper[4795]: I0219 21:54:49.520835 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:54:49 crc kubenswrapper[4795]: E0219 21:54:49.522144 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:03 crc kubenswrapper[4795]: I0219 21:55:03.512215 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:03 crc kubenswrapper[4795]: E0219 21:55:03.513343 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:13 crc kubenswrapper[4795]: I0219 21:55:13.685742 4795 scope.go:117] "RemoveContainer" containerID="2f19263c7946bfa6317a4079eb988e89329f3723a790f4116428242859b8575d" Feb 19 21:55:13 crc kubenswrapper[4795]: I0219 21:55:13.730485 4795 scope.go:117] "RemoveContainer" containerID="971150e9373b5419fd5efc7fc3e78faafd9906f987cb7e0a9e042f04e22cb5a9" Feb 19 21:55:17 crc kubenswrapper[4795]: I0219 21:55:17.511124 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:17 crc kubenswrapper[4795]: E0219 21:55:17.511717 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:29 crc kubenswrapper[4795]: I0219 21:55:29.521351 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:29 crc kubenswrapper[4795]: E0219 21:55:29.522789 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:42 crc kubenswrapper[4795]: I0219 21:55:42.511799 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:42 crc kubenswrapper[4795]: E0219 21:55:42.512801 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:54 crc kubenswrapper[4795]: I0219 21:55:54.511591 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:54 crc kubenswrapper[4795]: E0219 21:55:54.512899 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:08 crc kubenswrapper[4795]: I0219 21:56:08.512071 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:08 crc kubenswrapper[4795]: E0219 21:56:08.513141 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:23 crc kubenswrapper[4795]: I0219 21:56:23.514332 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:23 crc kubenswrapper[4795]: E0219 21:56:23.515057 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:36 crc kubenswrapper[4795]: I0219 21:56:36.511334 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:36 crc kubenswrapper[4795]: E0219 21:56:36.512500 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:47 crc kubenswrapper[4795]: I0219 21:56:47.511580 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:47 crc kubenswrapper[4795]: E0219 21:56:47.512645 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:59 crc kubenswrapper[4795]: I0219 21:56:59.514958 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:59 crc kubenswrapper[4795]: E0219 21:56:59.515681 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:57:14 crc kubenswrapper[4795]: I0219 21:57:14.511899 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:57:14 crc kubenswrapper[4795]: E0219 21:57:14.512987 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:57:26 crc kubenswrapper[4795]: I0219 21:57:26.516192 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:57:26 crc kubenswrapper[4795]: E0219 21:57:26.516961 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:57:37 crc kubenswrapper[4795]: I0219 21:57:37.512018 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:57:37 crc kubenswrapper[4795]: E0219 21:57:37.512955 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:57:51 crc kubenswrapper[4795]: I0219 21:57:51.512607 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:57:51 crc kubenswrapper[4795]: E0219 21:57:51.513931 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:58:05 crc kubenswrapper[4795]: I0219 21:58:05.512240 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:58:05 crc kubenswrapper[4795]: E0219 21:58:05.513015 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:58:20 crc kubenswrapper[4795]: I0219 21:58:20.511523 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:58:20 crc kubenswrapper[4795]: E0219 21:58:20.512376 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:58:31 crc kubenswrapper[4795]: I0219 21:58:31.513873 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:58:32 crc kubenswrapper[4795]: I0219 21:58:32.420960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768"} Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.068195 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 21:59:54 crc kubenswrapper[4795]: E0219 21:59:54.069049 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="registry-server" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.069063 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="registry-server" Feb 19 21:59:54 crc kubenswrapper[4795]: E0219 21:59:54.069078 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="extract-content" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.069083 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="extract-content" Feb 19 21:59:54 crc kubenswrapper[4795]: E0219 21:59:54.069099 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="extract-utilities" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.069105 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="extract-utilities" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.069261 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="registry-server" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.070227 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.093866 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.155005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.155056 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.155115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.256595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.256647 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.256704 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.257496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.257646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.275227 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.389750 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.872132 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 21:59:54 crc kubenswrapper[4795]: W0219 21:59:54.873600 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4814500d_f15e_4457_ad1b_24ae2f076b47.slice/crio-af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95 WatchSource:0}: Error finding container af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95: Status 404 returned error can't find the container with id af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95 Feb 19 21:59:55 crc kubenswrapper[4795]: I0219 21:59:55.110181 4795 generic.go:334] "Generic (PLEG): container finished" podID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerID="e5fb53341d7b3ce79711e4a20d06ba6ff9336ef38668b0ff7fb9994f9fff1772" exitCode=0 Feb 19 21:59:55 crc kubenswrapper[4795]: I0219 21:59:55.110241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerDied","Data":"e5fb53341d7b3ce79711e4a20d06ba6ff9336ef38668b0ff7fb9994f9fff1772"} Feb 19 21:59:55 crc kubenswrapper[4795]: I0219 21:59:55.110460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerStarted","Data":"af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95"} Feb 19 21:59:55 crc kubenswrapper[4795]: I0219 21:59:55.112409 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:59:57 crc kubenswrapper[4795]: I0219 21:59:57.134084 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerID="7c2c45c677e30510633a93f2ad571caf052ed8a68f47e0d48c10e86c9f5c5898" exitCode=0 Feb 19 21:59:57 crc kubenswrapper[4795]: I0219 21:59:57.134131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerDied","Data":"7c2c45c677e30510633a93f2ad571caf052ed8a68f47e0d48c10e86c9f5c5898"} Feb 19 21:59:58 crc kubenswrapper[4795]: I0219 21:59:58.144728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerStarted","Data":"e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db"} Feb 19 21:59:58 crc kubenswrapper[4795]: I0219 21:59:58.171066 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xkgwx" podStartSLOduration=1.7317850890000002 podStartE2EDuration="4.171031281s" podCreationTimestamp="2026-02-19 21:59:54 +0000 UTC" firstStartedPulling="2026-02-19 21:59:55.112191625 +0000 UTC m=+1906.304709479" lastFinishedPulling="2026-02-19 21:59:57.551437797 +0000 UTC m=+1908.743955671" observedRunningTime="2026-02-19 21:59:58.16603065 +0000 UTC m=+1909.358548584" watchObservedRunningTime="2026-02-19 21:59:58.171031281 +0000 UTC m=+1909.363549155" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.146272 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"] Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.147691 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.150099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.152495 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.188470 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"] Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.237966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.238073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.238105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.339607 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.339758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.339808 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.340848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.346463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.357325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.471889 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.910332 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"] Feb 19 22:00:01 crc kubenswrapper[4795]: I0219 22:00:01.166142 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" event={"ID":"b6bba469-9e7c-4517-bc8d-2d5a5308edef","Type":"ContainerStarted","Data":"32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418"} Feb 19 22:00:01 crc kubenswrapper[4795]: I0219 22:00:01.178479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" event={"ID":"b6bba469-9e7c-4517-bc8d-2d5a5308edef","Type":"ContainerStarted","Data":"ec59d844647e5cfdac2fe8c8c762677b385f33bbd1b22bc0652c9d827737d3fd"} Feb 19 22:00:01 crc kubenswrapper[4795]: I0219 22:00:01.202899 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" 
podStartSLOduration=1.202879485 podStartE2EDuration="1.202879485s" podCreationTimestamp="2026-02-19 22:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:00:01.198028129 +0000 UTC m=+1912.390546003" watchObservedRunningTime="2026-02-19 22:00:01.202879485 +0000 UTC m=+1912.395397369" Feb 19 22:00:01 crc kubenswrapper[4795]: E0219 22:00:01.322261 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6bba469_9e7c_4517_bc8d_2d5a5308edef.slice/crio-32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418.scope\": RecentStats: unable to find data in memory cache]" Feb 19 22:00:02 crc kubenswrapper[4795]: I0219 22:00:02.175966 4795 generic.go:334] "Generic (PLEG): container finished" podID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" containerID="32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418" exitCode=0 Feb 19 22:00:02 crc kubenswrapper[4795]: I0219 22:00:02.176026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" event={"ID":"b6bba469-9e7c-4517-bc8d-2d5a5308edef","Type":"ContainerDied","Data":"32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418"} Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.445215 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.583835 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") pod \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.584008 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") pod \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.584716 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") pod \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.585321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6bba469-9e7c-4517-bc8d-2d5a5308edef" (UID: "b6bba469-9e7c-4517-bc8d-2d5a5308edef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.588995 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6bba469-9e7c-4517-bc8d-2d5a5308edef" (UID: "b6bba469-9e7c-4517-bc8d-2d5a5308edef"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.589472 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf" (OuterVolumeSpecName: "kube-api-access-gv7tf") pod "b6bba469-9e7c-4517-bc8d-2d5a5308edef" (UID: "b6bba469-9e7c-4517-bc8d-2d5a5308edef"). InnerVolumeSpecName "kube-api-access-gv7tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.686408 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.686459 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.686474 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.191272 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" event={"ID":"b6bba469-9e7c-4517-bc8d-2d5a5308edef","Type":"ContainerDied","Data":"ec59d844647e5cfdac2fe8c8c762677b385f33bbd1b22bc0652c9d827737d3fd"} Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.191328 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec59d844647e5cfdac2fe8c8c762677b385f33bbd1b22bc0652c9d827737d3fd" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.191626 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.390897 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.390955 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.435143 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:05 crc kubenswrapper[4795]: I0219 22:00:05.246671 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:06 crc kubenswrapper[4795]: I0219 22:00:06.236026 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 22:00:07 crc kubenswrapper[4795]: I0219 22:00:07.215078 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xkgwx" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="registry-server" containerID="cri-o://e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db" gracePeriod=2 Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.238364 4795 generic.go:334] "Generic (PLEG): container finished" podID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerID="e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db" exitCode=0 Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.238450 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerDied","Data":"e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db"} Feb 19 22:00:09 crc 
kubenswrapper[4795]: I0219 22:00:09.504327 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.583030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") pod \"4814500d-f15e-4457-ad1b-24ae2f076b47\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.583135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") pod \"4814500d-f15e-4457-ad1b-24ae2f076b47\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.583286 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") pod \"4814500d-f15e-4457-ad1b-24ae2f076b47\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.585267 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities" (OuterVolumeSpecName: "utilities") pod "4814500d-f15e-4457-ad1b-24ae2f076b47" (UID: "4814500d-f15e-4457-ad1b-24ae2f076b47"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.595467 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l" (OuterVolumeSpecName: "kube-api-access-gwx5l") pod "4814500d-f15e-4457-ad1b-24ae2f076b47" (UID: "4814500d-f15e-4457-ad1b-24ae2f076b47"). InnerVolumeSpecName "kube-api-access-gwx5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.687428 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.687490 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.708511 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4814500d-f15e-4457-ad1b-24ae2f076b47" (UID: "4814500d-f15e-4457-ad1b-24ae2f076b47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.788666 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.247867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerDied","Data":"af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95"} Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.247910 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.247923 4795 scope.go:117] "RemoveContainer" containerID="e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db" Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.287944 4795 scope.go:117] "RemoveContainer" containerID="7c2c45c677e30510633a93f2ad571caf052ed8a68f47e0d48c10e86c9f5c5898" Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.292826 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.303619 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.313127 4795 scope.go:117] "RemoveContainer" containerID="e5fb53341d7b3ce79711e4a20d06ba6ff9336ef38668b0ff7fb9994f9fff1772" Feb 19 22:00:11 crc kubenswrapper[4795]: I0219 22:00:11.519151 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" path="/var/lib/kubelet/pods/4814500d-f15e-4457-ad1b-24ae2f076b47/volumes" Feb 19 22:00:31 crc 
kubenswrapper[4795]: I0219 22:00:31.140361 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:31 crc kubenswrapper[4795]: E0219 22:00:31.141247 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="registry-server" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141264 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="registry-server" Feb 19 22:00:31 crc kubenswrapper[4795]: E0219 22:00:31.141277 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="extract-content" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141284 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="extract-content" Feb 19 22:00:31 crc kubenswrapper[4795]: E0219 22:00:31.141295 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="extract-utilities" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="extract-utilities" Feb 19 22:00:31 crc kubenswrapper[4795]: E0219 22:00:31.141315 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" containerName="collect-profiles" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141322 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" containerName="collect-profiles" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141498 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="registry-server" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 
22:00:31.141519 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" containerName="collect-profiles" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.142840 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.147053 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.293506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt6rq\" (UniqueName: \"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.293848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.293958 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.394941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt6rq\" (UniqueName: 
\"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.395010 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.395039 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.395592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.395643 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.417459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt6rq\" (UniqueName: 
\"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.465551 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.928961 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.930606 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.959901 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.004457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.004560 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.004587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.004669 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.105426 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.105466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.105526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.105948 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: 
I0219 22:00:32.106063 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.125390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.280746 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.530601 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:32 crc kubenswrapper[4795]: W0219 22:00:32.534409 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81fba4b2_8284_4218_848a_d969914c88d4.slice/crio-c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7 WatchSource:0}: Error finding container c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7: Status 404 returned error can't find the container with id c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7 Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.607040 4795 generic.go:334] "Generic (PLEG): container finished" podID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerID="1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d" exitCode=0 Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.607096 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerDied","Data":"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d"} Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.607408 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerStarted","Data":"67faa7fa182c32f61a7ee9cd36ad3cb7441940baaa79652676306e2045363915"} Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.609983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerStarted","Data":"c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7"} Feb 19 22:00:33 crc kubenswrapper[4795]: I0219 22:00:33.618581 4795 generic.go:334] "Generic (PLEG): container finished" podID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerID="0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60" exitCode=0 Feb 19 22:00:33 crc kubenswrapper[4795]: I0219 22:00:33.618663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerDied","Data":"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60"} Feb 19 22:00:33 crc kubenswrapper[4795]: I0219 22:00:33.620159 4795 generic.go:334] "Generic (PLEG): container finished" podID="81fba4b2-8284-4218-848a-d969914c88d4" containerID="d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3" exitCode=0 Feb 19 22:00:33 crc kubenswrapper[4795]: I0219 22:00:33.620248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" 
event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerDied","Data":"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3"} Feb 19 22:00:34 crc kubenswrapper[4795]: I0219 22:00:34.628624 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerStarted","Data":"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837"} Feb 19 22:00:34 crc kubenswrapper[4795]: I0219 22:00:34.630117 4795 generic.go:334] "Generic (PLEG): container finished" podID="81fba4b2-8284-4218-848a-d969914c88d4" containerID="21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0" exitCode=0 Feb 19 22:00:34 crc kubenswrapper[4795]: I0219 22:00:34.630178 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerDied","Data":"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0"} Feb 19 22:00:34 crc kubenswrapper[4795]: I0219 22:00:34.663806 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nth7v" podStartSLOduration=2.166059846 podStartE2EDuration="3.663789099s" podCreationTimestamp="2026-02-19 22:00:31 +0000 UTC" firstStartedPulling="2026-02-19 22:00:32.608866156 +0000 UTC m=+1943.801384020" lastFinishedPulling="2026-02-19 22:00:34.106595389 +0000 UTC m=+1945.299113273" observedRunningTime="2026-02-19 22:00:34.655608903 +0000 UTC m=+1945.848126767" watchObservedRunningTime="2026-02-19 22:00:34.663789099 +0000 UTC m=+1945.856306963" Feb 19 22:00:35 crc kubenswrapper[4795]: I0219 22:00:35.638150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" 
event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerStarted","Data":"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d"} Feb 19 22:00:35 crc kubenswrapper[4795]: I0219 22:00:35.656940 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lgk8z" podStartSLOduration=3.263142377 podStartE2EDuration="4.656923198s" podCreationTimestamp="2026-02-19 22:00:31 +0000 UTC" firstStartedPulling="2026-02-19 22:00:33.621426771 +0000 UTC m=+1944.813944635" lastFinishedPulling="2026-02-19 22:00:35.015207472 +0000 UTC m=+1946.207725456" observedRunningTime="2026-02-19 22:00:35.653478783 +0000 UTC m=+1946.845996647" watchObservedRunningTime="2026-02-19 22:00:35.656923198 +0000 UTC m=+1946.849441062" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.465860 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.466342 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.520953 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.713414 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.755933 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:42 crc kubenswrapper[4795]: I0219 22:00:42.280995 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:42 crc kubenswrapper[4795]: I0219 22:00:42.281352 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:42 crc kubenswrapper[4795]: I0219 22:00:42.330537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:42 crc kubenswrapper[4795]: I0219 22:00:42.717874 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:43 crc kubenswrapper[4795]: I0219 22:00:43.681446 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nth7v" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="registry-server" containerID="cri-o://5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" gracePeriod=2 Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.142240 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.152241 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.313508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") pod \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.313611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt6rq\" (UniqueName: \"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") pod \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 
22:00:44.313768 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") pod \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.314755 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities" (OuterVolumeSpecName: "utilities") pod "e21c86a5-45d5-4a66-ab91-2c5f63ed9560" (UID: "e21c86a5-45d5-4a66-ab91-2c5f63ed9560"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.327388 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq" (OuterVolumeSpecName: "kube-api-access-lt6rq") pod "e21c86a5-45d5-4a66-ab91-2c5f63ed9560" (UID: "e21c86a5-45d5-4a66-ab91-2c5f63ed9560"). InnerVolumeSpecName "kube-api-access-lt6rq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.415908 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.415946 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt6rq\" (UniqueName: \"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.582401 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e21c86a5-45d5-4a66-ab91-2c5f63ed9560" (UID: "e21c86a5-45d5-4a66-ab91-2c5f63ed9560"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.619124 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694428 4795 generic.go:334] "Generic (PLEG): container finished" podID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerID="5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" exitCode=0 Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694504 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerDied","Data":"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837"} Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694706 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerDied","Data":"67faa7fa182c32f61a7ee9cd36ad3cb7441940baaa79652676306e2045363915"} Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694747 4795 scope.go:117] "RemoveContainer" containerID="5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.722677 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.727944 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.739658 4795 scope.go:117] "RemoveContainer" containerID="0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.770462 4795 scope.go:117] "RemoveContainer" containerID="1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.795689 4795 scope.go:117] "RemoveContainer" containerID="5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" Feb 19 22:00:44 crc kubenswrapper[4795]: E0219 22:00:44.796293 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837\": container with ID starting with 5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837 not found: ID does not exist" containerID="5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.796341 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837"} err="failed to get container status \"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837\": rpc error: code = NotFound desc = could not find container \"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837\": container with ID starting with 5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837 not found: ID does not exist" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.796369 4795 scope.go:117] "RemoveContainer" containerID="0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60" Feb 19 22:00:44 crc kubenswrapper[4795]: E0219 22:00:44.796892 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60\": container with ID starting with 0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60 not found: ID does not exist" containerID="0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.796922 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60"} err="failed to get container status \"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60\": rpc error: code = NotFound desc = could not find container \"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60\": container with ID 
starting with 0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60 not found: ID does not exist" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.796940 4795 scope.go:117] "RemoveContainer" containerID="1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d" Feb 19 22:00:44 crc kubenswrapper[4795]: E0219 22:00:44.797532 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d\": container with ID starting with 1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d not found: ID does not exist" containerID="1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.797559 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d"} err="failed to get container status \"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d\": rpc error: code = NotFound desc = could not find container \"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d\": container with ID starting with 1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d not found: ID does not exist" Feb 19 22:00:45 crc kubenswrapper[4795]: I0219 22:00:45.527060 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" path="/var/lib/kubelet/pods/e21c86a5-45d5-4a66-ab91-2c5f63ed9560/volumes" Feb 19 22:00:45 crc kubenswrapper[4795]: I0219 22:00:45.702782 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lgk8z" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="registry-server" containerID="cri-o://2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" gracePeriod=2 Feb 19 22:00:46 crc 
kubenswrapper[4795]: I0219 22:00:46.143307 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.247813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") pod \"81fba4b2-8284-4218-848a-d969914c88d4\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.247874 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") pod \"81fba4b2-8284-4218-848a-d969914c88d4\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.247984 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") pod \"81fba4b2-8284-4218-848a-d969914c88d4\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.249462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities" (OuterVolumeSpecName: "utilities") pod "81fba4b2-8284-4218-848a-d969914c88d4" (UID: "81fba4b2-8284-4218-848a-d969914c88d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.255364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g" (OuterVolumeSpecName: "kube-api-access-ttm5g") pod "81fba4b2-8284-4218-848a-d969914c88d4" (UID: "81fba4b2-8284-4218-848a-d969914c88d4"). InnerVolumeSpecName "kube-api-access-ttm5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.280579 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81fba4b2-8284-4218-848a-d969914c88d4" (UID: "81fba4b2-8284-4218-848a-d969914c88d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.350271 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.350306 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.350319 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711053 4795 generic.go:334] "Generic (PLEG): container finished" podID="81fba4b2-8284-4218-848a-d969914c88d4" 
containerID="2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" exitCode=0 Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711093 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerDied","Data":"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d"} Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711119 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711138 4795 scope.go:117] "RemoveContainer" containerID="2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerDied","Data":"c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7"} Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.726010 4795 scope.go:117] "RemoveContainer" containerID="21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.743877 4795 scope.go:117] "RemoveContainer" containerID="d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.761561 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.769270 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.775559 4795 scope.go:117] "RemoveContainer" containerID="2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" Feb 19 
22:00:46 crc kubenswrapper[4795]: E0219 22:00:46.778565 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d\": container with ID starting with 2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d not found: ID does not exist" containerID="2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.778594 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d"} err="failed to get container status \"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d\": rpc error: code = NotFound desc = could not find container \"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d\": container with ID starting with 2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d not found: ID does not exist" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.778616 4795 scope.go:117] "RemoveContainer" containerID="21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0" Feb 19 22:00:46 crc kubenswrapper[4795]: E0219 22:00:46.778922 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0\": container with ID starting with 21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0 not found: ID does not exist" containerID="21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.778943 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0"} err="failed to get container status 
\"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0\": rpc error: code = NotFound desc = could not find container \"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0\": container with ID starting with 21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0 not found: ID does not exist" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.778957 4795 scope.go:117] "RemoveContainer" containerID="d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3" Feb 19 22:00:46 crc kubenswrapper[4795]: E0219 22:00:46.779206 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3\": container with ID starting with d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3 not found: ID does not exist" containerID="d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.779230 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3"} err="failed to get container status \"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3\": rpc error: code = NotFound desc = could not find container \"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3\": container with ID starting with d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3 not found: ID does not exist" Feb 19 22:00:47 crc kubenswrapper[4795]: I0219 22:00:47.522881 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81fba4b2-8284-4218-848a-d969914c88d4" path="/var/lib/kubelet/pods/81fba4b2-8284-4218-848a-d969914c88d4/volumes" Feb 19 22:00:58 crc kubenswrapper[4795]: I0219 22:00:58.428044 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:00:58 crc kubenswrapper[4795]: I0219 22:00:58.428708 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:01:28 crc kubenswrapper[4795]: I0219 22:01:28.427948 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:01:28 crc kubenswrapper[4795]: I0219 22:01:28.428512 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.427720 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.428267 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.428312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.428922 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.428981 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768" gracePeriod=600 Feb 19 22:01:59 crc kubenswrapper[4795]: I0219 22:01:59.222318 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768" exitCode=0 Feb 19 22:01:59 crc kubenswrapper[4795]: I0219 22:01:59.222425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768"} Feb 19 22:01:59 crc kubenswrapper[4795]: I0219 22:01:59.222930 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"} Feb 19 22:01:59 crc kubenswrapper[4795]: I0219 22:01:59.222951 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 22:03:58 crc kubenswrapper[4795]: I0219 22:03:58.428288 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:03:58 crc kubenswrapper[4795]: I0219 22:03:58.429267 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.117549 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118566 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118584 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118600 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="extract-content" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118607 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="extract-content" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118619 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="extract-utilities" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118626 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="extract-utilities" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118641 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="extract-content" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118648 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="extract-content" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118663 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118671 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118682 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="extract-utilities" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118689 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="extract-utilities" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118874 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118900 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.120108 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.142752 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.252765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.252842 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.252926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.358168 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") pod \"certified-operators-m84kt\" (UID: 
\"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.358602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.358654 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.358730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.359031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.384855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") pod \"certified-operators-m84kt\" (UID: 
\"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.456399 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.908434 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:23 crc kubenswrapper[4795]: I0219 22:04:23.399717 4795 generic.go:334] "Generic (PLEG): container finished" podID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerID="0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7" exitCode=0 Feb 19 22:04:23 crc kubenswrapper[4795]: I0219 22:04:23.399777 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerDied","Data":"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7"} Feb 19 22:04:23 crc kubenswrapper[4795]: I0219 22:04:23.399816 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerStarted","Data":"bd7076d5f999cd2836934242f282bd72d95dbb6690558499c8b183f067011a71"} Feb 19 22:04:24 crc kubenswrapper[4795]: I0219 22:04:24.431321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerStarted","Data":"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571"} Feb 19 22:04:25 crc kubenswrapper[4795]: I0219 22:04:25.440472 4795 generic.go:334] "Generic (PLEG): container finished" podID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerID="007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571" exitCode=0 Feb 19 22:04:25 crc kubenswrapper[4795]: I0219 
22:04:25.440523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerDied","Data":"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571"} Feb 19 22:04:26 crc kubenswrapper[4795]: I0219 22:04:26.449918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerStarted","Data":"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2"} Feb 19 22:04:26 crc kubenswrapper[4795]: I0219 22:04:26.471801 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m84kt" podStartSLOduration=2.023590685 podStartE2EDuration="4.471779389s" podCreationTimestamp="2026-02-19 22:04:22 +0000 UTC" firstStartedPulling="2026-02-19 22:04:23.401931813 +0000 UTC m=+2174.594449717" lastFinishedPulling="2026-02-19 22:04:25.850120547 +0000 UTC m=+2177.042638421" observedRunningTime="2026-02-19 22:04:26.468012565 +0000 UTC m=+2177.660530439" watchObservedRunningTime="2026-02-19 22:04:26.471779389 +0000 UTC m=+2177.664297263" Feb 19 22:04:28 crc kubenswrapper[4795]: I0219 22:04:28.427815 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:04:28 crc kubenswrapper[4795]: I0219 22:04:28.428118 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:04:32 crc 
kubenswrapper[4795]: I0219 22:04:32.456852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:32 crc kubenswrapper[4795]: I0219 22:04:32.457298 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:32 crc kubenswrapper[4795]: I0219 22:04:32.510701 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:32 crc kubenswrapper[4795]: I0219 22:04:32.568150 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:32 crc kubenswrapper[4795]: I0219 22:04:32.751393 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:34 crc kubenswrapper[4795]: I0219 22:04:34.524608 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m84kt" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="registry-server" containerID="cri-o://14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" gracePeriod=2 Feb 19 22:04:34 crc kubenswrapper[4795]: I0219 22:04:34.931310 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.050367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") pod \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.050414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") pod \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.050464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") pod \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.051700 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities" (OuterVolumeSpecName: "utilities") pod "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" (UID: "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.070205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6" (OuterVolumeSpecName: "kube-api-access-bc4r6") pod "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" (UID: "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd"). InnerVolumeSpecName "kube-api-access-bc4r6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.152241 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") on node \"crc\" DevicePath \"\"" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.152511 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.410087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" (UID: "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.456461 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536086 4795 generic.go:334] "Generic (PLEG): container finished" podID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerID="14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" exitCode=0 Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536149 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerDied","Data":"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2"} Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536207 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerDied","Data":"bd7076d5f999cd2836934242f282bd72d95dbb6690558499c8b183f067011a71"} Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536234 4795 scope.go:117] "RemoveContainer" containerID="14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536117 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.567130 4795 scope.go:117] "RemoveContainer" containerID="007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.568081 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.580952 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.600555 4795 scope.go:117] "RemoveContainer" containerID="0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.625080 4795 scope.go:117] "RemoveContainer" containerID="14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" Feb 19 22:04:35 crc kubenswrapper[4795]: E0219 22:04:35.625680 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2\": container with ID starting with 14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2 not found: ID does not exist" containerID="14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 
22:04:35.625710 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2"} err="failed to get container status \"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2\": rpc error: code = NotFound desc = could not find container \"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2\": container with ID starting with 14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2 not found: ID does not exist" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.625736 4795 scope.go:117] "RemoveContainer" containerID="007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571" Feb 19 22:04:35 crc kubenswrapper[4795]: E0219 22:04:35.626080 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571\": container with ID starting with 007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571 not found: ID does not exist" containerID="007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.626116 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571"} err="failed to get container status \"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571\": rpc error: code = NotFound desc = could not find container \"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571\": container with ID starting with 007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571 not found: ID does not exist" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.626138 4795 scope.go:117] "RemoveContainer" containerID="0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7" Feb 19 22:04:35 crc 
kubenswrapper[4795]: E0219 22:04:35.626515 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7\": container with ID starting with 0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7 not found: ID does not exist" containerID="0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.626536 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7"} err="failed to get container status \"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7\": rpc error: code = NotFound desc = could not find container \"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7\": container with ID starting with 0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7 not found: ID does not exist" Feb 19 22:04:37 crc kubenswrapper[4795]: I0219 22:04:37.529403 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" path="/var/lib/kubelet/pods/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd/volumes" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.428278 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.429118 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.429231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.430095 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.430238 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" gracePeriod=600 Feb 19 22:04:58 crc kubenswrapper[4795]: E0219 22:04:58.563997 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.730001 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" exitCode=0 Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.730049 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"} Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.730084 4795 scope.go:117] "RemoveContainer" containerID="6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.730533 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:04:58 crc kubenswrapper[4795]: E0219 22:04:58.730762 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:05:12 crc kubenswrapper[4795]: I0219 22:05:12.511685 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:05:12 crc kubenswrapper[4795]: E0219 22:05:12.512232 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:05:24 crc kubenswrapper[4795]: I0219 22:05:24.512152 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:05:24 crc kubenswrapper[4795]: E0219 22:05:24.513027 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:05:36 crc kubenswrapper[4795]: I0219 22:05:36.511541 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:05:36 crc kubenswrapper[4795]: E0219 22:05:36.512295 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:05:49 crc kubenswrapper[4795]: I0219 22:05:49.511991 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:05:49 crc kubenswrapper[4795]: E0219 22:05:49.512729 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:06:00 crc kubenswrapper[4795]: I0219 22:06:00.511623 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:06:00 crc kubenswrapper[4795]: E0219 
22:06:00.512358 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:06:11 crc kubenswrapper[4795]: I0219 22:06:11.512112 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:06:11 crc kubenswrapper[4795]: E0219 22:06:11.512866 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:06:23 crc kubenswrapper[4795]: I0219 22:06:23.513154 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:06:23 crc kubenswrapper[4795]: E0219 22:06:23.513922 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:06:35 crc kubenswrapper[4795]: I0219 22:06:35.511743 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:06:35 crc 
kubenswrapper[4795]: E0219 22:06:35.512555 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:06:47 crc kubenswrapper[4795]: I0219 22:06:47.511631 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:06:47 crc kubenswrapper[4795]: E0219 22:06:47.512519 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:07:00 crc kubenswrapper[4795]: I0219 22:07:00.248494 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" podUID="5a3a8d91-b500-48db-9ceb-cc105b2eeb3a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 22:07:00 crc kubenswrapper[4795]: I0219 22:07:00.276063 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:07:00 crc kubenswrapper[4795]: E0219 22:07:00.276557 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:07:10 crc kubenswrapper[4795]: I0219 22:07:10.513214 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:07:10 crc kubenswrapper[4795]: E0219 22:07:10.514010 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:07:23 crc kubenswrapper[4795]: I0219 22:07:23.513182 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:07:23 crc kubenswrapper[4795]: E0219 22:07:23.513895 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:07:35 crc kubenswrapper[4795]: I0219 22:07:35.512449 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:07:35 crc kubenswrapper[4795]: E0219 22:07:35.513342 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:07:46 crc kubenswrapper[4795]: I0219 22:07:46.511851 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:07:46 crc kubenswrapper[4795]: E0219 22:07:46.512641 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:07:58 crc kubenswrapper[4795]: I0219 22:07:58.512067 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:07:58 crc kubenswrapper[4795]: E0219 22:07:58.512983 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:08:09 crc kubenswrapper[4795]: I0219 22:08:09.521054 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:08:09 crc kubenswrapper[4795]: E0219 22:08:09.521796 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:08:23 crc kubenswrapper[4795]: I0219 22:08:23.511521 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:08:23 crc kubenswrapper[4795]: E0219 22:08:23.512481 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:08:34 crc kubenswrapper[4795]: I0219 22:08:34.512679 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:08:34 crc kubenswrapper[4795]: E0219 22:08:34.513545 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:08:48 crc kubenswrapper[4795]: I0219 22:08:48.512263 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:08:48 crc kubenswrapper[4795]: E0219 22:08:48.514972 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:09:01 crc kubenswrapper[4795]: I0219 22:09:01.514221 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:09:01 crc kubenswrapper[4795]: E0219 22:09:01.517459 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:09:16 crc kubenswrapper[4795]: I0219 22:09:16.512284 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:09:16 crc kubenswrapper[4795]: E0219 22:09:16.512987 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:09:28 crc kubenswrapper[4795]: I0219 22:09:28.512616 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:09:28 crc kubenswrapper[4795]: E0219 22:09:28.513859 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:09:39 crc kubenswrapper[4795]: I0219 22:09:39.521098 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:09:39 crc kubenswrapper[4795]: E0219 22:09:39.522694 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:09:51 crc kubenswrapper[4795]: I0219 22:09:51.511335 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:09:51 crc kubenswrapper[4795]: E0219 22:09:51.512069 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.802019 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"]
Feb 19 22:10:02 crc kubenswrapper[4795]: E0219 22:10:02.806497 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="extract-utilities"
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.806528 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="extract-utilities"
Feb 19 22:10:02 crc kubenswrapper[4795]: E0219 22:10:02.806540 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="extract-content"
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.806547 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="extract-content"
Feb 19 22:10:02 crc kubenswrapper[4795]: E0219 22:10:02.806565 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="registry-server"
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.806571 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="registry-server"
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.806708 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="registry-server"
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.807647 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.813620 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"]
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.973759 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.973872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.973897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.076065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.076146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.076233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.076813 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.077024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.094957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.136924 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.968909 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"]
Feb 19 22:10:04 crc kubenswrapper[4795]: I0219 22:10:04.593018 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerID="0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb" exitCode=0
Feb 19 22:10:04 crc kubenswrapper[4795]: I0219 22:10:04.593239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerDied","Data":"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb"}
Feb 19 22:10:04 crc kubenswrapper[4795]: I0219 22:10:04.593389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerStarted","Data":"46c880d5e4e646af453acd759e46c0247571e60c4ac2e1e243dece3a82b42d6d"}
Feb 19 22:10:04 crc kubenswrapper[4795]: I0219 22:10:04.594540 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 22:10:06 crc kubenswrapper[4795]: I0219 22:10:06.518055 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"
Feb 19 22:10:07 crc kubenswrapper[4795]: I0219 22:10:07.612937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c"}
Feb 19 22:10:08 crc kubenswrapper[4795]: I0219 22:10:08.621089 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerID="ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43" exitCode=0
Feb 19 22:10:08 crc kubenswrapper[4795]: I0219 22:10:08.621206 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerDied","Data":"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43"}
Feb 19 22:10:09 crc kubenswrapper[4795]: I0219 22:10:09.632418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerStarted","Data":"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e"}
Feb 19 22:10:09 crc kubenswrapper[4795]: I0219 22:10:09.664209 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ngq7m" podStartSLOduration=3.01183413 podStartE2EDuration="7.66419235s" podCreationTimestamp="2026-02-19 22:10:02 +0000 UTC" firstStartedPulling="2026-02-19 22:10:04.594306978 +0000 UTC m=+2515.786824842" lastFinishedPulling="2026-02-19 22:10:09.246665198 +0000 UTC m=+2520.439183062" observedRunningTime="2026-02-19 22:10:09.657675639 +0000 UTC m=+2520.850193523" watchObservedRunningTime="2026-02-19 22:10:09.66419235 +0000 UTC m=+2520.856710214"
Feb 19 22:10:13 crc kubenswrapper[4795]: I0219 22:10:13.137966 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:13 crc kubenswrapper[4795]: I0219 22:10:13.138364 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:14 crc kubenswrapper[4795]: I0219 22:10:14.181952 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ngq7m" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server" probeResult="failure" output=<
Feb 19 22:10:14 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 22:10:14 crc kubenswrapper[4795]: >
Feb 19 22:10:23 crc kubenswrapper[4795]: I0219 22:10:23.186040 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:23 crc kubenswrapper[4795]: I0219 22:10:23.238635 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:23 crc kubenswrapper[4795]: I0219 22:10:23.414946 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"]
Feb 19 22:10:24 crc kubenswrapper[4795]: I0219 22:10:24.745902 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ngq7m" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server" containerID="cri-o://0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e" gracePeriod=2
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.148018 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.299189 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") pod \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") "
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.299389 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") pod \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") "
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.299427 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") pod \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") "
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.300521 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities" (OuterVolumeSpecName: "utilities") pod "7d37836f-dda1-46ea-8bd3-46b3d1e40115" (UID: "7d37836f-dda1-46ea-8bd3-46b3d1e40115"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.305376 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x" (OuterVolumeSpecName: "kube-api-access-ds46x") pod "7d37836f-dda1-46ea-8bd3-46b3d1e40115" (UID: "7d37836f-dda1-46ea-8bd3-46b3d1e40115"). InnerVolumeSpecName "kube-api-access-ds46x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.401385 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.401428 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") on node \"crc\" DevicePath \"\""
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.429487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d37836f-dda1-46ea-8bd3-46b3d1e40115" (UID: "7d37836f-dda1-46ea-8bd3-46b3d1e40115"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.503578 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758824 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerID="0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e" exitCode=0
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerDied","Data":"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e"}
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758897 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngq7m"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerDied","Data":"46c880d5e4e646af453acd759e46c0247571e60c4ac2e1e243dece3a82b42d6d"}
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758942 4795 scope.go:117] "RemoveContainer" containerID="0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.791439 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"]
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.800476 4795 scope.go:117] "RemoveContainer" containerID="ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.802910 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"]
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.831485 4795 scope.go:117] "RemoveContainer" containerID="0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.844963 4795 scope.go:117] "RemoveContainer" containerID="0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e"
Feb 19 22:10:25 crc kubenswrapper[4795]: E0219 22:10:25.845430 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e\": container with ID starting with 0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e not found: ID does not exist" containerID="0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.845482 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e"} err="failed to get container status \"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e\": rpc error: code = NotFound desc = could not find container \"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e\": container with ID starting with 0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e not found: ID does not exist"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.845507 4795 scope.go:117] "RemoveContainer" containerID="ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43"
Feb 19 22:10:25 crc kubenswrapper[4795]: E0219 22:10:25.845906 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43\": container with ID starting with ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43 not found: ID does not exist" containerID="ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.845965 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43"} err="failed to get container status \"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43\": rpc error: code = NotFound desc = could not find container \"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43\": container with ID starting with ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43 not found: ID does not exist"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.845997 4795 scope.go:117] "RemoveContainer" containerID="0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb"
Feb 19 22:10:25 crc kubenswrapper[4795]: E0219 22:10:25.846335 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb\": container with ID starting with 0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb not found: ID does not exist" containerID="0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb"
Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.846363 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb"} err="failed to get container status \"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb\": rpc error: code = NotFound desc = could not find container \"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb\": container with ID starting with 0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb not found: ID does not exist"
Feb 19 22:10:27 crc kubenswrapper[4795]: I0219 22:10:27.523145 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" path="/var/lib/kubelet/pods/7d37836f-dda1-46ea-8bd3-46b3d1e40115/volumes"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.621034 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"]
Feb 19 22:10:36 crc kubenswrapper[4795]: E0219 22:10:36.622055 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="extract-utilities"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.622075 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="extract-utilities"
Feb 19 22:10:36 crc kubenswrapper[4795]: E0219 22:10:36.622097 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="extract-content"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.622106 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="extract-content"
Feb 19 22:10:36 crc kubenswrapper[4795]: E0219 22:10:36.622126 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.622134 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.622322 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.623572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.637688 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"]
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.763614 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.763904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.764005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.865812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.866114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.866275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.866820 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.866919 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.897319 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.940874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:37 crc kubenswrapper[4795]: I0219 22:10:37.154046 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"]
Feb 19 22:10:37 crc kubenswrapper[4795]: I0219 22:10:37.860279 4795 generic.go:334] "Generic (PLEG): container finished" podID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerID="9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225" exitCode=0
Feb 19 22:10:37 crc kubenswrapper[4795]: I0219 22:10:37.860326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerDied","Data":"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225"}
Feb 19 22:10:37 crc kubenswrapper[4795]: I0219 22:10:37.860360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerStarted","Data":"c62b34ad6fb32b473e03faa814eef441291609a20f868aa6c180ac55b5240145"}
Feb 19 22:10:39 crc kubenswrapper[4795]: I0219 22:10:39.875958 4795 generic.go:334] "Generic (PLEG): container finished" podID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerID="81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7" exitCode=0
Feb 19 22:10:39 crc kubenswrapper[4795]: I0219 22:10:39.876028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerDied","Data":"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7"}
Feb 19 22:10:40 crc kubenswrapper[4795]: I0219 22:10:40.884899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerStarted","Data":"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e"}
Feb 19 22:10:40 crc kubenswrapper[4795]: I0219 22:10:40.911640 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ctpsf" podStartSLOduration=2.522363009 podStartE2EDuration="4.911620683s" podCreationTimestamp="2026-02-19 22:10:36 +0000 UTC" firstStartedPulling="2026-02-19 22:10:37.861869188 +0000 UTC m=+2549.054387062" lastFinishedPulling="2026-02-19 22:10:40.251126872 +0000 UTC m=+2551.443644736" observedRunningTime="2026-02-19 22:10:40.90269146 +0000 UTC m=+2552.095209324" watchObservedRunningTime="2026-02-19 22:10:40.911620683 +0000 UTC m=+2552.104138547"
Feb 19 22:10:46 crc kubenswrapper[4795]: I0219 22:10:46.941538 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ctpsf"
Feb 19 22:10:46 crc kubenswrapper[4795]: I0219 22:10:46.942021 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup"
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:46 crc kubenswrapper[4795]: I0219 22:10:46.988010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:47 crc kubenswrapper[4795]: I0219 22:10:47.984857 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:48 crc kubenswrapper[4795]: I0219 22:10:48.027509 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"] Feb 19 22:10:49 crc kubenswrapper[4795]: I0219 22:10:49.943037 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ctpsf" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="registry-server" containerID="cri-o://bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" gracePeriod=2 Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.408084 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.573663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") pod \"9215f5c9-f305-43b3-8e82-902a494f07d9\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.573734 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") pod \"9215f5c9-f305-43b3-8e82-902a494f07d9\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.573782 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") pod \"9215f5c9-f305-43b3-8e82-902a494f07d9\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.574663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities" (OuterVolumeSpecName: "utilities") pod "9215f5c9-f305-43b3-8e82-902a494f07d9" (UID: "9215f5c9-f305-43b3-8e82-902a494f07d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.582082 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms" (OuterVolumeSpecName: "kube-api-access-w7zms") pod "9215f5c9-f305-43b3-8e82-902a494f07d9" (UID: "9215f5c9-f305-43b3-8e82-902a494f07d9"). InnerVolumeSpecName "kube-api-access-w7zms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.616806 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9215f5c9-f305-43b3-8e82-902a494f07d9" (UID: "9215f5c9-f305-43b3-8e82-902a494f07d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.675149 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.675259 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") on node \"crc\" DevicePath \"\"" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.675279 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953733 4795 generic.go:334] "Generic (PLEG): container finished" podID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerID="bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" exitCode=0 Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerDied","Data":"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e"} Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953811 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953851 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerDied","Data":"c62b34ad6fb32b473e03faa814eef441291609a20f868aa6c180ac55b5240145"} Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953938 4795 scope.go:117] "RemoveContainer" containerID="bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.992433 4795 scope.go:117] "RemoveContainer" containerID="81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.993546 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"] Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.018736 4795 scope.go:117] "RemoveContainer" containerID="9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.027816 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"] Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.050512 4795 scope.go:117] "RemoveContainer" containerID="bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" Feb 19 22:10:51 crc kubenswrapper[4795]: E0219 22:10:51.051193 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e\": container with ID starting with bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e not found: ID does not exist" containerID="bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.051246 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e"} err="failed to get container status \"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e\": rpc error: code = NotFound desc = could not find container \"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e\": container with ID starting with bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e not found: ID does not exist" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.051276 4795 scope.go:117] "RemoveContainer" containerID="81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7" Feb 19 22:10:51 crc kubenswrapper[4795]: E0219 22:10:51.051729 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7\": container with ID starting with 81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7 not found: ID does not exist" containerID="81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.051767 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7"} err="failed to get container status \"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7\": rpc error: code = NotFound desc = could not find container \"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7\": container with ID starting with 81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7 not found: ID does not exist" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.051791 4795 scope.go:117] "RemoveContainer" containerID="9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225" Feb 19 22:10:51 crc kubenswrapper[4795]: E0219 
22:10:51.052107 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225\": container with ID starting with 9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225 not found: ID does not exist" containerID="9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.052137 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225"} err="failed to get container status \"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225\": rpc error: code = NotFound desc = could not find container \"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225\": container with ID starting with 9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225 not found: ID does not exist" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.522072 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" path="/var/lib/kubelet/pods/9215f5c9-f305-43b3-8e82-902a494f07d9/volumes" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.872283 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:23 crc kubenswrapper[4795]: E0219 22:11:23.873355 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="extract-utilities" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.873391 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="extract-utilities" Feb 19 22:11:23 crc kubenswrapper[4795]: E0219 22:11:23.873403 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="extract-content" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.873410 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="extract-content" Feb 19 22:11:23 crc kubenswrapper[4795]: E0219 22:11:23.873425 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="registry-server" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.873431 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="registry-server" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.873558 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="registry-server" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.874619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.894961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.021701 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.021750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") 
" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.021881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.123267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.123561 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.123686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.123882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") pod \"community-operators-x7hcn\" (UID: 
\"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.124054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.146405 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.202151 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.683009 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:25 crc kubenswrapper[4795]: E0219 22:11:25.463338 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5828c8e_eae0_448e_882d_fc02dc4ec6bb.slice/crio-conmon-0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973.scope\": RecentStats: unable to find data in memory cache]" Feb 19 22:11:25 crc kubenswrapper[4795]: I0219 22:11:25.490192 4795 generic.go:334] "Generic (PLEG): container finished" podID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerID="0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973" exitCode=0 Feb 19 22:11:25 crc kubenswrapper[4795]: I0219 22:11:25.490240 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerDied","Data":"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973"} Feb 19 22:11:25 crc kubenswrapper[4795]: I0219 22:11:25.490270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerStarted","Data":"151baa46fb777750b92c9310f491b67b5f5cb18c83876a635d6951907b4bc717"} Feb 19 22:11:27 crc kubenswrapper[4795]: I0219 22:11:27.957722 4795 patch_prober.go:28] interesting pod/route-controller-manager-84d5f88f56-vjpj2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 22:11:27 crc kubenswrapper[4795]: I0219 22:11:27.958132 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" podUID="8e80efae-f1ac-40f9-ad38-61dc2821499e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 22:11:27 crc kubenswrapper[4795]: I0219 22:11:27.972009 4795 patch_prober.go:28] interesting pod/route-controller-manager-84d5f88f56-vjpj2 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 22:11:27 crc kubenswrapper[4795]: I0219 22:11:27.972066 4795 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" podUID="8e80efae-f1ac-40f9-ad38-61dc2821499e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 22:11:30 crc kubenswrapper[4795]: I0219 22:11:30.188383 4795 generic.go:334] "Generic (PLEG): container finished" podID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerID="0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2" exitCode=0 Feb 19 22:11:30 crc kubenswrapper[4795]: I0219 22:11:30.188469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerDied","Data":"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2"} Feb 19 22:11:31 crc kubenswrapper[4795]: I0219 22:11:31.198424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerStarted","Data":"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013"} Feb 19 22:11:31 crc kubenswrapper[4795]: I0219 22:11:31.228306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x7hcn" podStartSLOduration=3.168102005 podStartE2EDuration="8.228280534s" podCreationTimestamp="2026-02-19 22:11:23 +0000 UTC" firstStartedPulling="2026-02-19 22:11:25.492439423 +0000 UTC m=+2596.684957287" lastFinishedPulling="2026-02-19 22:11:30.552617942 +0000 UTC m=+2601.745135816" observedRunningTime="2026-02-19 22:11:31.224948129 +0000 UTC m=+2602.417466023" watchObservedRunningTime="2026-02-19 22:11:31.228280534 +0000 UTC m=+2602.420798428" Feb 19 22:11:34 crc kubenswrapper[4795]: I0219 22:11:34.202271 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:34 crc kubenswrapper[4795]: I0219 22:11:34.202646 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:34 crc kubenswrapper[4795]: I0219 22:11:34.263681 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.251285 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.314826 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.328719 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x7hcn" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="registry-server" containerID="cri-o://dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" gracePeriod=2 Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.743693 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.891434 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") pod \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.891503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") pod \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.891539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") pod \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.892603 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities" (OuterVolumeSpecName: "utilities") pod "e5828c8e-eae0-448e-882d-fc02dc4ec6bb" (UID: "e5828c8e-eae0-448e-882d-fc02dc4ec6bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.897261 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774" (OuterVolumeSpecName: "kube-api-access-p6774") pod "e5828c8e-eae0-448e-882d-fc02dc4ec6bb" (UID: "e5828c8e-eae0-448e-882d-fc02dc4ec6bb"). InnerVolumeSpecName "kube-api-access-p6774". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.950901 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5828c8e-eae0-448e-882d-fc02dc4ec6bb" (UID: "e5828c8e-eae0-448e-882d-fc02dc4ec6bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.993367 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.993420 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") on node \"crc\" DevicePath \"\"" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.993449 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342289 4795 generic.go:334] "Generic (PLEG): container finished" podID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerID="dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" exitCode=0 Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342332 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerDied","Data":"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013"} Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342363 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerDied","Data":"151baa46fb777750b92c9310f491b67b5f5cb18c83876a635d6951907b4bc717"} Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342379 4795 scope.go:117] "RemoveContainer" containerID="dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342398 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.374847 4795 scope.go:117] "RemoveContainer" containerID="0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.398652 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.407908 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.415917 4795 scope.go:117] "RemoveContainer" containerID="0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.437478 4795 scope.go:117] "RemoveContainer" containerID="dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" Feb 19 22:11:45 crc kubenswrapper[4795]: E0219 22:11:45.437935 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013\": container with ID starting with dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013 not found: ID does not exist" containerID="dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 
22:11:45.437985 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013"} err="failed to get container status \"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013\": rpc error: code = NotFound desc = could not find container \"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013\": container with ID starting with dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013 not found: ID does not exist" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.438023 4795 scope.go:117] "RemoveContainer" containerID="0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2" Feb 19 22:11:45 crc kubenswrapper[4795]: E0219 22:11:45.438389 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2\": container with ID starting with 0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2 not found: ID does not exist" containerID="0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.438419 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2"} err="failed to get container status \"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2\": rpc error: code = NotFound desc = could not find container \"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2\": container with ID starting with 0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2 not found: ID does not exist" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.438437 4795 scope.go:117] "RemoveContainer" containerID="0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973" Feb 19 22:11:45 crc 
kubenswrapper[4795]: E0219 22:11:45.438755 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973\": container with ID starting with 0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973 not found: ID does not exist" containerID="0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.438847 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973"} err="failed to get container status \"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973\": rpc error: code = NotFound desc = could not find container \"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973\": container with ID starting with 0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973 not found: ID does not exist" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.531810 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" path="/var/lib/kubelet/pods/e5828c8e-eae0-448e-882d-fc02dc4ec6bb/volumes" Feb 19 22:12:28 crc kubenswrapper[4795]: I0219 22:12:28.427828 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:12:28 crc kubenswrapper[4795]: I0219 22:12:28.428500 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 19 22:12:58 crc kubenswrapper[4795]: I0219 22:12:58.427635 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:12:58 crc kubenswrapper[4795]: I0219 22:12:58.428390 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.427210 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.427827 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.427880 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.428561 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.428630 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c" gracePeriod=600 Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.928376 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c" exitCode=0 Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.928628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c"} Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.928845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc"} Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.928883 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.843208 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:14:56 crc kubenswrapper[4795]: E0219 
22:14:56.844063 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="extract-utilities" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.844075 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="extract-utilities" Feb 19 22:14:56 crc kubenswrapper[4795]: E0219 22:14:56.844090 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="registry-server" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.844097 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="registry-server" Feb 19 22:14:56 crc kubenswrapper[4795]: E0219 22:14:56.844111 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="extract-content" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.844117 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="extract-content" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.844257 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="registry-server" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.845133 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.859574 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.920199 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.920579 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.920604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.021907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.021993 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.022074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.022548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.022631 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.049940 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.167647 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.639884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:14:58 crc kubenswrapper[4795]: I0219 22:14:58.639142 4795 generic.go:334] "Generic (PLEG): container finished" podID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerID="83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3" exitCode=0 Feb 19 22:14:58 crc kubenswrapper[4795]: I0219 22:14:58.639332 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerDied","Data":"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3"} Feb 19 22:14:58 crc kubenswrapper[4795]: I0219 22:14:58.639636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerStarted","Data":"9bbf4153c93ff4e35104fe4418d5dffeea83e979c4b08141e95d740546057f98"} Feb 19 22:14:59 crc kubenswrapper[4795]: I0219 22:14:59.654955 4795 generic.go:334] "Generic (PLEG): container finished" podID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerID="21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2" exitCode=0 Feb 19 22:14:59 crc kubenswrapper[4795]: I0219 22:14:59.655024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerDied","Data":"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2"} Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.142660 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"] Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.143786 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.146905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.146999 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.153600 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"] Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.272099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.272250 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.272282 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.373740 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.373791 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.373833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.374721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.379255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.389248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.479949 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.686188 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"] Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.689788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerStarted","Data":"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d"} Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.718304 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-98crc" podStartSLOduration=3.310897461 podStartE2EDuration="4.718285429s" podCreationTimestamp="2026-02-19 22:14:56 +0000 UTC" firstStartedPulling="2026-02-19 22:14:58.641543773 +0000 UTC m=+2809.834061667" lastFinishedPulling="2026-02-19 22:15:00.048931731 +0000 UTC m=+2811.241449635" observedRunningTime="2026-02-19 22:15:00.714469969 +0000 UTC m=+2811.906987843" watchObservedRunningTime="2026-02-19 
22:15:00.718285429 +0000 UTC m=+2811.910803293" Feb 19 22:15:01 crc kubenswrapper[4795]: I0219 22:15:01.699143 4795 generic.go:334] "Generic (PLEG): container finished" podID="1d0ffce6-6c23-4d04-a029-6322d065ff24" containerID="e5dc57de5d860d1b9f4b0c7fa5487f6cf98d60033e436fbbba7b629cc254b689" exitCode=0 Feb 19 22:15:01 crc kubenswrapper[4795]: I0219 22:15:01.699197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" event={"ID":"1d0ffce6-6c23-4d04-a029-6322d065ff24","Type":"ContainerDied","Data":"e5dc57de5d860d1b9f4b0c7fa5487f6cf98d60033e436fbbba7b629cc254b689"} Feb 19 22:15:01 crc kubenswrapper[4795]: I0219 22:15:01.699479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" event={"ID":"1d0ffce6-6c23-4d04-a029-6322d065ff24","Type":"ContainerStarted","Data":"cd961236cfb338693aa8da100c9802e866f4fc6a2a3b71e1259855b1da0ee739"} Feb 19 22:15:02 crc kubenswrapper[4795]: I0219 22:15:02.947101 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.009942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") pod \"1d0ffce6-6c23-4d04-a029-6322d065ff24\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.010345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") pod \"1d0ffce6-6c23-4d04-a029-6322d065ff24\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.010392 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") pod \"1d0ffce6-6c23-4d04-a029-6322d065ff24\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.011435 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume" (OuterVolumeSpecName: "config-volume") pod "1d0ffce6-6c23-4d04-a029-6322d065ff24" (UID: "1d0ffce6-6c23-4d04-a029-6322d065ff24"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.015045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj" (OuterVolumeSpecName: "kube-api-access-542fj") pod "1d0ffce6-6c23-4d04-a029-6322d065ff24" (UID: "1d0ffce6-6c23-4d04-a029-6322d065ff24"). 
InnerVolumeSpecName "kube-api-access-542fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.015102 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1d0ffce6-6c23-4d04-a029-6322d065ff24" (UID: "1d0ffce6-6c23-4d04-a029-6322d065ff24"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.111865 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.111899 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.111909 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.712566 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" event={"ID":"1d0ffce6-6c23-4d04-a029-6322d065ff24","Type":"ContainerDied","Data":"cd961236cfb338693aa8da100c9802e866f4fc6a2a3b71e1259855b1da0ee739"} Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.712606 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd961236cfb338693aa8da100c9802e866f4fc6a2a3b71e1259855b1da0ee739" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.712656 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:04 crc kubenswrapper[4795]: I0219 22:15:04.030857 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"] Feb 19 22:15:04 crc kubenswrapper[4795]: I0219 22:15:04.036939 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"] Feb 19 22:15:05 crc kubenswrapper[4795]: I0219 22:15:05.528095 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" path="/var/lib/kubelet/pods/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a/volumes" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.167890 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.168431 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.222886 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.784763 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.838714 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:15:09 crc kubenswrapper[4795]: I0219 22:15:09.752216 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-98crc" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="registry-server" 
containerID="cri-o://a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" gracePeriod=2 Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.662369 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.726936 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") pod \"2acef6a5-277e-40ec-bf10-e7da2131e214\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.727261 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") pod \"2acef6a5-277e-40ec-bf10-e7da2131e214\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.727421 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") pod \"2acef6a5-277e-40ec-bf10-e7da2131e214\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.728803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities" (OuterVolumeSpecName: "utilities") pod "2acef6a5-277e-40ec-bf10-e7da2131e214" (UID: "2acef6a5-277e-40ec-bf10-e7da2131e214"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.732680 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw" (OuterVolumeSpecName: "kube-api-access-lwxjw") pod "2acef6a5-277e-40ec-bf10-e7da2131e214" (UID: "2acef6a5-277e-40ec-bf10-e7da2131e214"). InnerVolumeSpecName "kube-api-access-lwxjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761116 4795 generic.go:334] "Generic (PLEG): container finished" podID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerID="a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" exitCode=0 Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerDied","Data":"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d"} Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerDied","Data":"9bbf4153c93ff4e35104fe4418d5dffeea83e979c4b08141e95d740546057f98"} Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761194 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761211 4795 scope.go:117] "RemoveContainer" containerID="a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.786319 4795 scope.go:117] "RemoveContainer" containerID="21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.786515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2acef6a5-277e-40ec-bf10-e7da2131e214" (UID: "2acef6a5-277e-40ec-bf10-e7da2131e214"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.806429 4795 scope.go:117] "RemoveContainer" containerID="83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.829103 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.829135 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.829144 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.837471 4795 scope.go:117] "RemoveContainer" 
containerID="a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" Feb 19 22:15:10 crc kubenswrapper[4795]: E0219 22:15:10.837887 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d\": container with ID starting with a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d not found: ID does not exist" containerID="a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.837914 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d"} err="failed to get container status \"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d\": rpc error: code = NotFound desc = could not find container \"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d\": container with ID starting with a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d not found: ID does not exist" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.837935 4795 scope.go:117] "RemoveContainer" containerID="21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2" Feb 19 22:15:10 crc kubenswrapper[4795]: E0219 22:15:10.838397 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2\": container with ID starting with 21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2 not found: ID does not exist" containerID="21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.838445 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2"} err="failed to get container status \"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2\": rpc error: code = NotFound desc = could not find container \"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2\": container with ID starting with 21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2 not found: ID does not exist" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.838474 4795 scope.go:117] "RemoveContainer" containerID="83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3" Feb 19 22:15:10 crc kubenswrapper[4795]: E0219 22:15:10.838800 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3\": container with ID starting with 83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3 not found: ID does not exist" containerID="83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.838835 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3"} err="failed to get container status \"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3\": rpc error: code = NotFound desc = could not find container \"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3\": container with ID starting with 83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3 not found: ID does not exist" Feb 19 22:15:11 crc kubenswrapper[4795]: I0219 22:15:11.086607 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:15:11 crc kubenswrapper[4795]: I0219 22:15:11.091334 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:15:11 crc kubenswrapper[4795]: I0219 22:15:11.529826 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" path="/var/lib/kubelet/pods/2acef6a5-277e-40ec-bf10-e7da2131e214/volumes" Feb 19 22:15:14 crc kubenswrapper[4795]: I0219 22:15:14.117112 4795 scope.go:117] "RemoveContainer" containerID="8f02da0738e9d178691c0389499a8480ffbd2f961710f2202fa44516bf81c4c6" Feb 19 22:15:28 crc kubenswrapper[4795]: I0219 22:15:28.427111 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:15:28 crc kubenswrapper[4795]: I0219 22:15:28.427705 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:15:58 crc kubenswrapper[4795]: I0219 22:15:58.428531 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:15:58 crc kubenswrapper[4795]: I0219 22:15:58.430349 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:16:28 crc 
kubenswrapper[4795]: I0219 22:16:28.427752 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:16:28 crc kubenswrapper[4795]: I0219 22:16:28.428665 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:16:28 crc kubenswrapper[4795]: I0219 22:16:28.428755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:16:28 crc kubenswrapper[4795]: I0219 22:16:28.429900 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:16:28 crc kubenswrapper[4795]: I0219 22:16:28.430045 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" gracePeriod=600 Feb 19 22:16:28 crc kubenswrapper[4795]: E0219 22:16:28.560045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:16:29 crc kubenswrapper[4795]: I0219 22:16:29.388829 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" exitCode=0 Feb 19 22:16:29 crc kubenswrapper[4795]: I0219 22:16:29.388889 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc"} Feb 19 22:16:29 crc kubenswrapper[4795]: I0219 22:16:29.388942 4795 scope.go:117] "RemoveContainer" containerID="dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c" Feb 19 22:16:29 crc kubenswrapper[4795]: I0219 22:16:29.389490 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:16:29 crc kubenswrapper[4795]: E0219 22:16:29.389720 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:16:41 crc kubenswrapper[4795]: I0219 22:16:41.512802 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:16:41 crc kubenswrapper[4795]: E0219 22:16:41.513412 4795 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:16:53 crc kubenswrapper[4795]: I0219 22:16:53.513232 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:16:53 crc kubenswrapper[4795]: E0219 22:16:53.514406 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:17:04 crc kubenswrapper[4795]: I0219 22:17:04.512206 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:17:04 crc kubenswrapper[4795]: E0219 22:17:04.513530 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:17:17 crc kubenswrapper[4795]: I0219 22:17:17.511638 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:17:17 crc kubenswrapper[4795]: E0219 22:17:17.512404 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:17:32 crc kubenswrapper[4795]: I0219 22:17:32.512120 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:17:32 crc kubenswrapper[4795]: E0219 22:17:32.513317 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:17:47 crc kubenswrapper[4795]: I0219 22:17:47.511730 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:17:47 crc kubenswrapper[4795]: E0219 22:17:47.512899 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:01 crc kubenswrapper[4795]: I0219 22:18:01.512346 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:01 crc kubenswrapper[4795]: E0219 22:18:01.513080 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:13 crc kubenswrapper[4795]: I0219 22:18:13.514135 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:13 crc kubenswrapper[4795]: E0219 22:18:13.514906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:28 crc kubenswrapper[4795]: I0219 22:18:28.512301 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:28 crc kubenswrapper[4795]: E0219 22:18:28.513411 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:43 crc kubenswrapper[4795]: I0219 22:18:43.511970 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:43 crc kubenswrapper[4795]: E0219 
22:18:43.512676 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:58 crc kubenswrapper[4795]: I0219 22:18:58.511647 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:58 crc kubenswrapper[4795]: E0219 22:18:58.512573 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:19:12 crc kubenswrapper[4795]: I0219 22:19:12.512010 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:19:12 crc kubenswrapper[4795]: E0219 22:19:12.512634 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:19:25 crc kubenswrapper[4795]: I0219 22:19:25.511756 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:19:25 crc 
kubenswrapper[4795]: E0219 22:19:25.512515 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:19:39 crc kubenswrapper[4795]: I0219 22:19:39.544572 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:19:39 crc kubenswrapper[4795]: E0219 22:19:39.545575 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:19:53 crc kubenswrapper[4795]: I0219 22:19:53.512599 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:19:53 crc kubenswrapper[4795]: E0219 22:19:53.513549 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:04 crc kubenswrapper[4795]: I0219 22:20:04.511945 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 
19 22:20:04 crc kubenswrapper[4795]: E0219 22:20:04.512473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424103 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:06 crc kubenswrapper[4795]: E0219 22:20:06.424668 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0ffce6-6c23-4d04-a029-6322d065ff24" containerName="collect-profiles" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424684 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0ffce6-6c23-4d04-a029-6322d065ff24" containerName="collect-profiles" Feb 19 22:20:06 crc kubenswrapper[4795]: E0219 22:20:06.424694 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="extract-utilities" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424699 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="extract-utilities" Feb 19 22:20:06 crc kubenswrapper[4795]: E0219 22:20:06.424713 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="extract-content" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424720 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="extract-content" Feb 19 22:20:06 crc kubenswrapper[4795]: E0219 22:20:06.424729 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="registry-server" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424737 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="registry-server" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424898 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="registry-server" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424913 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0ffce6-6c23-4d04-a029-6322d065ff24" containerName="collect-profiles" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.425842 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.438358 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.606035 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.606439 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.606563 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708044 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708932 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.728650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.746011 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.206505 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.742686 4795 generic.go:334] "Generic (PLEG): container finished" podID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerID="30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d" exitCode=0 Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.742991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerDied","Data":"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d"} Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.743024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerStarted","Data":"92f4e566bfbad9d24382f5e82cfab642304375b17d5b3ef1979e50a28f7f804a"} Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.744196 4795 provider.go:102] Refreshing cache 
for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:20:08 crc kubenswrapper[4795]: I0219 22:20:08.757682 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerStarted","Data":"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10"} Feb 19 22:20:09 crc kubenswrapper[4795]: I0219 22:20:09.765355 4795 generic.go:334] "Generic (PLEG): container finished" podID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerID="e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10" exitCode=0 Feb 19 22:20:09 crc kubenswrapper[4795]: I0219 22:20:09.765429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerDied","Data":"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10"} Feb 19 22:20:10 crc kubenswrapper[4795]: I0219 22:20:10.774130 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerStarted","Data":"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261"} Feb 19 22:20:10 crc kubenswrapper[4795]: I0219 22:20:10.793514 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mftpp" podStartSLOduration=2.399963529 podStartE2EDuration="4.793495513s" podCreationTimestamp="2026-02-19 22:20:06 +0000 UTC" firstStartedPulling="2026-02-19 22:20:07.743908219 +0000 UTC m=+3118.936426083" lastFinishedPulling="2026-02-19 22:20:10.137440213 +0000 UTC m=+3121.329958067" observedRunningTime="2026-02-19 22:20:10.791228986 +0000 UTC m=+3121.983746860" watchObservedRunningTime="2026-02-19 22:20:10.793495513 +0000 UTC m=+3121.986013377" Feb 19 22:20:17 crc kubenswrapper[4795]: I0219 22:20:17.110106 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:17 crc kubenswrapper[4795]: I0219 22:20:17.112014 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:18 crc kubenswrapper[4795]: I0219 22:20:18.253914 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mftpp" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" probeResult="failure" output=< Feb 19 22:20:18 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 22:20:18 crc kubenswrapper[4795]: > Feb 19 22:20:18 crc kubenswrapper[4795]: I0219 22:20:18.511658 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:20:18 crc kubenswrapper[4795]: E0219 22:20:18.512155 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:26 crc kubenswrapper[4795]: I0219 22:20:26.781953 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:26 crc kubenswrapper[4795]: I0219 22:20:26.834522 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:27 crc kubenswrapper[4795]: I0219 22:20:27.017613 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.234067 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mftpp" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" containerID="cri-o://57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" gracePeriod=2 Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.599093 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.766369 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") pod \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.766456 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") pod \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.766491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") pod \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.767104 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities" (OuterVolumeSpecName: "utilities") pod "cda983e9-8c54-4e35-aa6a-a3ae6501c46e" (UID: "cda983e9-8c54-4e35-aa6a-a3ae6501c46e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.771725 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d" (OuterVolumeSpecName: "kube-api-access-wk55d") pod "cda983e9-8c54-4e35-aa6a-a3ae6501c46e" (UID: "cda983e9-8c54-4e35-aa6a-a3ae6501c46e"). InnerVolumeSpecName "kube-api-access-wk55d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.868231 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.868270 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") on node \"crc\" DevicePath \"\"" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.885544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cda983e9-8c54-4e35-aa6a-a3ae6501c46e" (UID: "cda983e9-8c54-4e35-aa6a-a3ae6501c46e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.969900 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245222 4795 generic.go:334] "Generic (PLEG): container finished" podID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerID="57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" exitCode=0 Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245288 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerDied","Data":"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261"} Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245362 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerDied","Data":"92f4e566bfbad9d24382f5e82cfab642304375b17d5b3ef1979e50a28f7f804a"} Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245390 4795 scope.go:117] "RemoveContainer" containerID="57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.265700 4795 scope.go:117] "RemoveContainer" containerID="e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.279359 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 
22:20:29.292466 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.306433 4795 scope.go:117] "RemoveContainer" containerID="30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.323698 4795 scope.go:117] "RemoveContainer" containerID="57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" Feb 19 22:20:29 crc kubenswrapper[4795]: E0219 22:20:29.325938 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261\": container with ID starting with 57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261 not found: ID does not exist" containerID="57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.325983 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261"} err="failed to get container status \"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261\": rpc error: code = NotFound desc = could not find container \"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261\": container with ID starting with 57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261 not found: ID does not exist" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.326020 4795 scope.go:117] "RemoveContainer" containerID="e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10" Feb 19 22:20:29 crc kubenswrapper[4795]: E0219 22:20:29.326489 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10\": container with ID 
starting with e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10 not found: ID does not exist" containerID="e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.326543 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10"} err="failed to get container status \"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10\": rpc error: code = NotFound desc = could not find container \"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10\": container with ID starting with e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10 not found: ID does not exist" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.326580 4795 scope.go:117] "RemoveContainer" containerID="30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d" Feb 19 22:20:29 crc kubenswrapper[4795]: E0219 22:20:29.327011 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d\": container with ID starting with 30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d not found: ID does not exist" containerID="30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.327052 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d"} err="failed to get container status \"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d\": rpc error: code = NotFound desc = could not find container \"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d\": container with ID starting with 30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d not found: 
ID does not exist" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.523785 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" path="/var/lib/kubelet/pods/cda983e9-8c54-4e35-aa6a-a3ae6501c46e/volumes" Feb 19 22:20:33 crc kubenswrapper[4795]: I0219 22:20:33.511871 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:20:33 crc kubenswrapper[4795]: E0219 22:20:33.512557 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:45 crc kubenswrapper[4795]: I0219 22:20:45.512536 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:20:45 crc kubenswrapper[4795]: E0219 22:20:45.514994 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.866981 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:20:47 crc kubenswrapper[4795]: E0219 22:20:47.867575 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" Feb 19 
22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.867590 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" Feb 19 22:20:47 crc kubenswrapper[4795]: E0219 22:20:47.867610 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="extract-content" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.867619 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="extract-content" Feb 19 22:20:47 crc kubenswrapper[4795]: E0219 22:20:47.867641 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="extract-utilities" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.867650 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="extract-utilities" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.867824 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.869028 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.882844 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.004320 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.004476 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.004586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.105971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.106073 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.106110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.106884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.107038 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.128784 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.190206 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.452846 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.925720 4795 generic.go:334] "Generic (PLEG): container finished" podID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerID="5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33" exitCode=0 Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.925783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerDied","Data":"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33"} Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.926072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerStarted","Data":"b756a66576e7cccaecd610df95a2f875171020cbdb675495d633bcadca6e3e32"} Feb 19 22:20:49 crc kubenswrapper[4795]: I0219 22:20:49.937040 4795 generic.go:334] "Generic (PLEG): container finished" podID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerID="242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b" exitCode=0 Feb 19 22:20:49 crc kubenswrapper[4795]: I0219 22:20:49.937177 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerDied","Data":"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b"} Feb 19 22:20:50 crc kubenswrapper[4795]: I0219 22:20:50.948222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" 
event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerStarted","Data":"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9"} Feb 19 22:20:50 crc kubenswrapper[4795]: I0219 22:20:50.969940 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j6m5s" podStartSLOduration=2.535970871 podStartE2EDuration="3.969885321s" podCreationTimestamp="2026-02-19 22:20:47 +0000 UTC" firstStartedPulling="2026-02-19 22:20:48.927960411 +0000 UTC m=+3160.120478275" lastFinishedPulling="2026-02-19 22:20:50.361874861 +0000 UTC m=+3161.554392725" observedRunningTime="2026-02-19 22:20:50.966241897 +0000 UTC m=+3162.158759801" watchObservedRunningTime="2026-02-19 22:20:50.969885321 +0000 UTC m=+3162.162403205" Feb 19 22:20:57 crc kubenswrapper[4795]: I0219 22:20:57.512556 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:20:57 crc kubenswrapper[4795]: E0219 22:20:57.513484 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:58 crc kubenswrapper[4795]: I0219 22:20:58.191358 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:58 crc kubenswrapper[4795]: I0219 22:20:58.191424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:58 crc kubenswrapper[4795]: I0219 22:20:58.256620 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:59 crc kubenswrapper[4795]: I0219 22:20:59.089814 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:59 crc kubenswrapper[4795]: I0219 22:20:59.155501 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.030465 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j6m5s" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="registry-server" containerID="cri-o://b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" gracePeriod=2 Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.477995 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.596297 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") pod \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.596434 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") pod \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.596524 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") pod 
\"583aaf4a-8b98-4385-b16a-009ddc9d03c1\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.597606 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities" (OuterVolumeSpecName: "utilities") pod "583aaf4a-8b98-4385-b16a-009ddc9d03c1" (UID: "583aaf4a-8b98-4385-b16a-009ddc9d03c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.611664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs" (OuterVolumeSpecName: "kube-api-access-dgmvs") pod "583aaf4a-8b98-4385-b16a-009ddc9d03c1" (UID: "583aaf4a-8b98-4385-b16a-009ddc9d03c1"). InnerVolumeSpecName "kube-api-access-dgmvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.621039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "583aaf4a-8b98-4385-b16a-009ddc9d03c1" (UID: "583aaf4a-8b98-4385-b16a-009ddc9d03c1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.697956 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.698228 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.698295 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.043490 4795 generic.go:334] "Generic (PLEG): container finished" podID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerID="b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" exitCode=0 Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.043533 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerDied","Data":"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9"} Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.043560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerDied","Data":"b756a66576e7cccaecd610df95a2f875171020cbdb675495d633bcadca6e3e32"} Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.043584 4795 scope.go:117] "RemoveContainer" containerID="b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 
22:21:02.043655 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.071291 4795 scope.go:117] "RemoveContainer" containerID="242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.085339 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.090004 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.107571 4795 scope.go:117] "RemoveContainer" containerID="5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.147086 4795 scope.go:117] "RemoveContainer" containerID="b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" Feb 19 22:21:02 crc kubenswrapper[4795]: E0219 22:21:02.147674 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9\": container with ID starting with b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9 not found: ID does not exist" containerID="b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.147732 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9"} err="failed to get container status \"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9\": rpc error: code = NotFound desc = could not find container \"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9\": container with ID starting with 
b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9 not found: ID does not exist" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.147757 4795 scope.go:117] "RemoveContainer" containerID="242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b" Feb 19 22:21:02 crc kubenswrapper[4795]: E0219 22:21:02.148149 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b\": container with ID starting with 242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b not found: ID does not exist" containerID="242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.148262 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b"} err="failed to get container status \"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b\": rpc error: code = NotFound desc = could not find container \"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b\": container with ID starting with 242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b not found: ID does not exist" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.148340 4795 scope.go:117] "RemoveContainer" containerID="5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33" Feb 19 22:21:02 crc kubenswrapper[4795]: E0219 22:21:02.148754 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33\": container with ID starting with 5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33 not found: ID does not exist" containerID="5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33" Feb 19 22:21:02 crc 
kubenswrapper[4795]: I0219 22:21:02.148785 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33"} err="failed to get container status \"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33\": rpc error: code = NotFound desc = could not find container \"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33\": container with ID starting with 5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33 not found: ID does not exist" Feb 19 22:21:03 crc kubenswrapper[4795]: I0219 22:21:03.530393 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" path="/var/lib/kubelet/pods/583aaf4a-8b98-4385-b16a-009ddc9d03c1/volumes" Feb 19 22:21:08 crc kubenswrapper[4795]: I0219 22:21:08.511604 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:21:08 crc kubenswrapper[4795]: E0219 22:21:08.512030 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:21:23 crc kubenswrapper[4795]: I0219 22:21:23.511842 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:21:23 crc kubenswrapper[4795]: E0219 22:21:23.512828 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:21:37 crc kubenswrapper[4795]: I0219 22:21:37.512119 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:21:38 crc kubenswrapper[4795]: I0219 22:21:38.396397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb"} Feb 19 22:23:58 crc kubenswrapper[4795]: I0219 22:23:58.427962 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:23:58 crc kubenswrapper[4795]: I0219 22:23:58.428769 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:24:28 crc kubenswrapper[4795]: I0219 22:24:28.427748 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:24:28 crc kubenswrapper[4795]: I0219 22:24:28.428559 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.428048 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.428608 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.428652 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.429208 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.429283 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
containerID="cri-o://0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb" gracePeriod=600 Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.936391 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb" exitCode=0 Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.936447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb"} Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.936618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f"} Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.936640 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.012332 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:15 crc kubenswrapper[4795]: E0219 22:25:15.013343 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="extract-utilities" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.013360 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="extract-utilities" Feb 19 22:25:15 crc kubenswrapper[4795]: E0219 22:25:15.013379 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="registry-server" Feb 19 
22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.013386 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="registry-server" Feb 19 22:25:15 crc kubenswrapper[4795]: E0219 22:25:15.013417 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="extract-content" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.013424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="extract-content" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.013591 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="registry-server" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.014718 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.024833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.072308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.072391 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 
crc kubenswrapper[4795]: I0219 22:25:15.072429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.173916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.173985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.174021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.174635 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc 
kubenswrapper[4795]: I0219 22:25:15.174722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.195310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.332974 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.791308 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:16 crc kubenswrapper[4795]: I0219 22:25:16.086344 4795 generic.go:334] "Generic (PLEG): container finished" podID="369db8af-2908-4268-9589-a73559afc23d" containerID="0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04" exitCode=0 Feb 19 22:25:16 crc kubenswrapper[4795]: I0219 22:25:16.086387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerDied","Data":"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04"} Feb 19 22:25:16 crc kubenswrapper[4795]: I0219 22:25:16.086410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" 
event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerStarted","Data":"7586dc34590369cf1598e1b2c64ff5204daaaef5a0c2190d004748d61e3ad765"} Feb 19 22:25:16 crc kubenswrapper[4795]: I0219 22:25:16.088650 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:25:17 crc kubenswrapper[4795]: I0219 22:25:17.095350 4795 generic.go:334] "Generic (PLEG): container finished" podID="369db8af-2908-4268-9589-a73559afc23d" containerID="dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed" exitCode=0 Feb 19 22:25:17 crc kubenswrapper[4795]: I0219 22:25:17.095407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerDied","Data":"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed"} Feb 19 22:25:18 crc kubenswrapper[4795]: I0219 22:25:18.107374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerStarted","Data":"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351"} Feb 19 22:25:18 crc kubenswrapper[4795]: I0219 22:25:18.132572 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zgnn" podStartSLOduration=2.717176035 podStartE2EDuration="4.132542546s" podCreationTimestamp="2026-02-19 22:25:14 +0000 UTC" firstStartedPulling="2026-02-19 22:25:16.088445772 +0000 UTC m=+3427.280963636" lastFinishedPulling="2026-02-19 22:25:17.503812253 +0000 UTC m=+3428.696330147" observedRunningTime="2026-02-19 22:25:18.126459613 +0000 UTC m=+3429.318977497" watchObservedRunningTime="2026-02-19 22:25:18.132542546 +0000 UTC m=+3429.325060450" Feb 19 22:25:25 crc kubenswrapper[4795]: I0219 22:25:25.333811 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:25 crc kubenswrapper[4795]: I0219 22:25:25.334470 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:25 crc kubenswrapper[4795]: I0219 22:25:25.387724 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:26 crc kubenswrapper[4795]: I0219 22:25:26.199404 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:26 crc kubenswrapper[4795]: I0219 22:25:26.248712 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.187475 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zgnn" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="registry-server" containerID="cri-o://26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" gracePeriod=2 Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.751835 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.802717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") pod \"369db8af-2908-4268-9589-a73559afc23d\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.802844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") pod \"369db8af-2908-4268-9589-a73559afc23d\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.802882 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") pod \"369db8af-2908-4268-9589-a73559afc23d\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.803689 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities" (OuterVolumeSpecName: "utilities") pod "369db8af-2908-4268-9589-a73559afc23d" (UID: "369db8af-2908-4268-9589-a73559afc23d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.808271 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh" (OuterVolumeSpecName: "kube-api-access-z7jzh") pod "369db8af-2908-4268-9589-a73559afc23d" (UID: "369db8af-2908-4268-9589-a73559afc23d"). InnerVolumeSpecName "kube-api-access-z7jzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.851983 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "369db8af-2908-4268-9589-a73559afc23d" (UID: "369db8af-2908-4268-9589-a73559afc23d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.904076 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.904111 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") on node \"crc\" DevicePath \"\"" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.904122 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.198974 4795 generic.go:334] "Generic (PLEG): container finished" podID="369db8af-2908-4268-9589-a73559afc23d" containerID="26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" exitCode=0 Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.199024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerDied","Data":"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351"} Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.199048 4795 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.199067 4795 scope.go:117] "RemoveContainer" containerID="26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.199054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerDied","Data":"7586dc34590369cf1598e1b2c64ff5204daaaef5a0c2190d004748d61e3ad765"} Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.233781 4795 scope.go:117] "RemoveContainer" containerID="dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.239078 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.244625 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.255604 4795 scope.go:117] "RemoveContainer" containerID="0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.279134 4795 scope.go:117] "RemoveContainer" containerID="26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" Feb 19 22:25:29 crc kubenswrapper[4795]: E0219 22:25:29.279825 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351\": container with ID starting with 26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351 not found: ID does not exist" containerID="26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.279869 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351"} err="failed to get container status \"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351\": rpc error: code = NotFound desc = could not find container \"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351\": container with ID starting with 26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351 not found: ID does not exist" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.279898 4795 scope.go:117] "RemoveContainer" containerID="dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed" Feb 19 22:25:29 crc kubenswrapper[4795]: E0219 22:25:29.280253 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed\": container with ID starting with dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed not found: ID does not exist" containerID="dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.280323 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed"} err="failed to get container status \"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed\": rpc error: code = NotFound desc = could not find container \"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed\": container with ID starting with dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed not found: ID does not exist" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.280373 4795 scope.go:117] "RemoveContainer" containerID="0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04" Feb 19 22:25:29 crc kubenswrapper[4795]: E0219 
22:25:29.280826 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04\": container with ID starting with 0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04 not found: ID does not exist" containerID="0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.280869 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04"} err="failed to get container status \"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04\": rpc error: code = NotFound desc = could not find container \"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04\": container with ID starting with 0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04 not found: ID does not exist" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.520702 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="369db8af-2908-4268-9589-a73559afc23d" path="/var/lib/kubelet/pods/369db8af-2908-4268-9589-a73559afc23d/volumes" Feb 19 22:26:58 crc kubenswrapper[4795]: I0219 22:26:58.428106 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:26:58 crc kubenswrapper[4795]: I0219 22:26:58.428771 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.837602 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:00 crc kubenswrapper[4795]: E0219 22:27:00.838844 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="extract-utilities" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.838878 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="extract-utilities" Feb 19 22:27:00 crc kubenswrapper[4795]: E0219 22:27:00.838928 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="registry-server" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.838949 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="registry-server" Feb 19 22:27:00 crc kubenswrapper[4795]: E0219 22:27:00.838979 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="extract-content" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.839018 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="extract-content" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.839445 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="registry-server" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.841965 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.852290 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.913234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.913304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.913692 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.015960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.016037 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.016119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.016647 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.016675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.045840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.188515 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.491523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.958274 4795 generic.go:334] "Generic (PLEG): container finished" podID="f690af90-79ff-44b5-88ad-970bfe721e55" containerID="c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee" exitCode=0 Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.958354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerDied","Data":"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee"} Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.958607 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerStarted","Data":"0f6c7a9b355c7198402cd2d227bffb002fc0b92346bf0bc97737e356a9a96ce1"} Feb 19 22:27:02 crc kubenswrapper[4795]: I0219 22:27:02.967978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerStarted","Data":"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc"} Feb 19 22:27:03 crc kubenswrapper[4795]: I0219 22:27:03.980693 4795 generic.go:334] "Generic (PLEG): container finished" podID="f690af90-79ff-44b5-88ad-970bfe721e55" containerID="dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc" exitCode=0 Feb 19 22:27:03 crc kubenswrapper[4795]: I0219 22:27:03.980768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" 
event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerDied","Data":"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc"} Feb 19 22:27:04 crc kubenswrapper[4795]: I0219 22:27:04.991958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerStarted","Data":"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000"} Feb 19 22:27:05 crc kubenswrapper[4795]: I0219 22:27:05.016892 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2rc7n" podStartSLOduration=2.584289957 podStartE2EDuration="5.016873221s" podCreationTimestamp="2026-02-19 22:27:00 +0000 UTC" firstStartedPulling="2026-02-19 22:27:01.962575703 +0000 UTC m=+3533.155093567" lastFinishedPulling="2026-02-19 22:27:04.395158947 +0000 UTC m=+3535.587676831" observedRunningTime="2026-02-19 22:27:05.012038023 +0000 UTC m=+3536.204555917" watchObservedRunningTime="2026-02-19 22:27:05.016873221 +0000 UTC m=+3536.209391095" Feb 19 22:27:11 crc kubenswrapper[4795]: I0219 22:27:11.189259 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:11 crc kubenswrapper[4795]: I0219 22:27:11.189660 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:11 crc kubenswrapper[4795]: I0219 22:27:11.254422 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:12 crc kubenswrapper[4795]: I0219 22:27:12.116520 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:12 crc kubenswrapper[4795]: I0219 22:27:12.166918 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.066925 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2rc7n" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="registry-server" containerID="cri-o://43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" gracePeriod=2 Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.445017 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.608767 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") pod \"f690af90-79ff-44b5-88ad-970bfe721e55\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.609074 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") pod \"f690af90-79ff-44b5-88ad-970bfe721e55\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.609117 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") pod \"f690af90-79ff-44b5-88ad-970bfe721e55\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.609636 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities" (OuterVolumeSpecName: "utilities") pod "f690af90-79ff-44b5-88ad-970bfe721e55" (UID: 
"f690af90-79ff-44b5-88ad-970bfe721e55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.619337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf" (OuterVolumeSpecName: "kube-api-access-55bbf") pod "f690af90-79ff-44b5-88ad-970bfe721e55" (UID: "f690af90-79ff-44b5-88ad-970bfe721e55"). InnerVolumeSpecName "kube-api-access-55bbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.659861 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f690af90-79ff-44b5-88ad-970bfe721e55" (UID: "f690af90-79ff-44b5-88ad-970bfe721e55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.710815 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.710857 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") on node \"crc\" DevicePath \"\"" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.710872 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.093860 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="f690af90-79ff-44b5-88ad-970bfe721e55" containerID="43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" exitCode=0 Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.093922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerDied","Data":"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000"} Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.093947 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerDied","Data":"0f6c7a9b355c7198402cd2d227bffb002fc0b92346bf0bc97737e356a9a96ce1"} Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.093962 4795 scope.go:117] "RemoveContainer" containerID="43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.094079 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.122614 4795 scope.go:117] "RemoveContainer" containerID="dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.129865 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.136010 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.140176 4795 scope.go:117] "RemoveContainer" containerID="c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.159656 4795 scope.go:117] "RemoveContainer" containerID="43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" Feb 19 22:27:15 crc kubenswrapper[4795]: E0219 22:27:15.160129 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000\": container with ID starting with 43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000 not found: ID does not exist" containerID="43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160160 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000"} err="failed to get container status \"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000\": rpc error: code = NotFound desc = could not find container \"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000\": container with ID starting with 43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000 not 
found: ID does not exist" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160202 4795 scope.go:117] "RemoveContainer" containerID="dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc" Feb 19 22:27:15 crc kubenswrapper[4795]: E0219 22:27:15.160427 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc\": container with ID starting with dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc not found: ID does not exist" containerID="dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160456 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc"} err="failed to get container status \"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc\": rpc error: code = NotFound desc = could not find container \"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc\": container with ID starting with dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc not found: ID does not exist" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160472 4795 scope.go:117] "RemoveContainer" containerID="c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee" Feb 19 22:27:15 crc kubenswrapper[4795]: E0219 22:27:15.160694 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee\": container with ID starting with c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee not found: ID does not exist" containerID="c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160719 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee"} err="failed to get container status \"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee\": rpc error: code = NotFound desc = could not find container \"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee\": container with ID starting with c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee not found: ID does not exist" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.521875 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" path="/var/lib/kubelet/pods/f690af90-79ff-44b5-88ad-970bfe721e55/volumes" Feb 19 22:27:28 crc kubenswrapper[4795]: I0219 22:27:28.428020 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:27:28 crc kubenswrapper[4795]: I0219 22:27:28.429093 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.427325 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.428189 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.428267 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.429325 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.429445 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" gracePeriod=600 Feb 19 22:27:58 crc kubenswrapper[4795]: E0219 22:27:58.548074 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:27:59 crc kubenswrapper[4795]: I0219 22:27:59.428084 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" 
containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" exitCode=0 Feb 19 22:27:59 crc kubenswrapper[4795]: I0219 22:27:59.428147 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f"} Feb 19 22:27:59 crc kubenswrapper[4795]: I0219 22:27:59.428218 4795 scope.go:117] "RemoveContainer" containerID="0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb" Feb 19 22:27:59 crc kubenswrapper[4795]: I0219 22:27:59.428891 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:27:59 crc kubenswrapper[4795]: E0219 22:27:59.429219 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:28:13 crc kubenswrapper[4795]: I0219 22:28:13.511726 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:28:13 crc kubenswrapper[4795]: E0219 22:28:13.512660 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:28:25 crc kubenswrapper[4795]: I0219 
22:28:25.512193 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:28:25 crc kubenswrapper[4795]: E0219 22:28:25.512941 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:28:37 crc kubenswrapper[4795]: I0219 22:28:37.512024 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:28:37 crc kubenswrapper[4795]: E0219 22:28:37.513029 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:28:49 crc kubenswrapper[4795]: I0219 22:28:49.519500 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:28:49 crc kubenswrapper[4795]: E0219 22:28:49.520569 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:29:04 crc 
kubenswrapper[4795]: I0219 22:29:04.512020 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:04 crc kubenswrapper[4795]: E0219 22:29:04.513038 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:29:16 crc kubenswrapper[4795]: I0219 22:29:16.513402 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:16 crc kubenswrapper[4795]: E0219 22:29:16.514426 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:29:31 crc kubenswrapper[4795]: I0219 22:29:31.511799 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:31 crc kubenswrapper[4795]: E0219 22:29:31.512874 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 
19 22:29:43 crc kubenswrapper[4795]: I0219 22:29:43.512856 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:43 crc kubenswrapper[4795]: E0219 22:29:43.513473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:29:58 crc kubenswrapper[4795]: I0219 22:29:58.511262 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:58 crc kubenswrapper[4795]: E0219 22:29:58.512400 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.181376 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 22:30:00 crc kubenswrapper[4795]: E0219 22:30:00.181812 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="extract-content" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.181829 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="extract-content" Feb 19 22:30:00 crc kubenswrapper[4795]: E0219 22:30:00.181845 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="registry-server" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.181852 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="registry-server" Feb 19 22:30:00 crc kubenswrapper[4795]: E0219 22:30:00.181867 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="extract-utilities" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.181879 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="extract-utilities" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.182046 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="registry-server" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.182837 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.185394 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.185915 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.187634 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.232983 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.233346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.233490 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.334883 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.335210 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.335339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.335729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.345811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.353055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.509385 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.922602 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 22:30:00 crc kubenswrapper[4795]: W0219 22:30:00.934907 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8d7fc5a_2c38_45d1_92d4_e30329082e49.slice/crio-9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea WatchSource:0}: Error finding container 9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea: Status 404 returned error can't find the container with id 9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea Feb 19 22:30:01 crc kubenswrapper[4795]: I0219 22:30:01.356467 4795 generic.go:334] "Generic (PLEG): container finished" podID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" containerID="2891e05af0080148a30c661d705a64987123339160913040e4d09a5170f489c1" exitCode=0 Feb 19 22:30:01 crc kubenswrapper[4795]: I0219 22:30:01.356511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" event={"ID":"e8d7fc5a-2c38-45d1-92d4-e30329082e49","Type":"ContainerDied","Data":"2891e05af0080148a30c661d705a64987123339160913040e4d09a5170f489c1"} Feb 19 22:30:01 crc kubenswrapper[4795]: I0219 22:30:01.356806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" event={"ID":"e8d7fc5a-2c38-45d1-92d4-e30329082e49","Type":"ContainerStarted","Data":"9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea"} Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.628840 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.766867 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") pod \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.766946 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") pod \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.767052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") pod \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.767681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume" (OuterVolumeSpecName: "config-volume") pod "e8d7fc5a-2c38-45d1-92d4-e30329082e49" (UID: "e8d7fc5a-2c38-45d1-92d4-e30329082e49"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.771722 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp" (OuterVolumeSpecName: "kube-api-access-f62pp") pod "e8d7fc5a-2c38-45d1-92d4-e30329082e49" (UID: "e8d7fc5a-2c38-45d1-92d4-e30329082e49"). InnerVolumeSpecName "kube-api-access-f62pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.771998 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e8d7fc5a-2c38-45d1-92d4-e30329082e49" (UID: "e8d7fc5a-2c38-45d1-92d4-e30329082e49"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.869720 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.870029 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.870141 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") on node \"crc\" DevicePath \"\"" Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.370316 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" event={"ID":"e8d7fc5a-2c38-45d1-92d4-e30329082e49","Type":"ContainerDied","Data":"9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea"} Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.370377 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea" Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.370441 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.707667 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.717315 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 22:30:05 crc kubenswrapper[4795]: I0219 22:30:05.523689 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1627c007-5a7c-4fa5-a15f-0da43560c849" path="/var/lib/kubelet/pods/1627c007-5a7c-4fa5-a15f-0da43560c849/volumes" Feb 19 22:30:13 crc kubenswrapper[4795]: I0219 22:30:13.512060 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:30:13 crc kubenswrapper[4795]: E0219 22:30:13.512906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:30:14 crc kubenswrapper[4795]: I0219 22:30:14.417996 4795 scope.go:117] "RemoveContainer" containerID="6d51fcadf72cdef4da5baae1dcfd6cb93884c814c13eaf31041e3c1336f34a93" Feb 19 22:30:25 crc kubenswrapper[4795]: I0219 22:30:25.512679 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:30:25 crc kubenswrapper[4795]: E0219 22:30:25.513471 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:30:36 crc kubenswrapper[4795]: I0219 22:30:36.513096 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:30:36 crc kubenswrapper[4795]: E0219 22:30:36.514025 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:30:48 crc kubenswrapper[4795]: I0219 22:30:48.512751 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:30:48 crc kubenswrapper[4795]: E0219 22:30:48.513788 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:03 crc kubenswrapper[4795]: I0219 22:31:03.512587 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:03 crc kubenswrapper[4795]: E0219 22:31:03.513309 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:17 crc kubenswrapper[4795]: I0219 22:31:17.511379 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:17 crc kubenswrapper[4795]: E0219 22:31:17.511952 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:30 crc kubenswrapper[4795]: I0219 22:31:30.511408 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:30 crc kubenswrapper[4795]: E0219 22:31:30.512143 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:43 crc kubenswrapper[4795]: I0219 22:31:43.515426 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:43 crc kubenswrapper[4795]: E0219 22:31:43.516247 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:56 crc kubenswrapper[4795]: I0219 22:31:56.511930 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:56 crc kubenswrapper[4795]: E0219 22:31:56.512741 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.275488 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:09 crc kubenswrapper[4795]: E0219 22:32:09.276670 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" containerName="collect-profiles" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.276695 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" containerName="collect-profiles" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.277007 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" containerName="collect-profiles" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.278899 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.299810 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.340107 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.340278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.340423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.483822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.483898 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.483929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.484362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.484623 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.507463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.605374 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.086205 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.356008 4795 generic.go:334] "Generic (PLEG): container finished" podID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerID="5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa" exitCode=0 Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.356087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerDied","Data":"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa"} Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.356338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerStarted","Data":"fd8dea2cc9816bec097755dd6dba91e5daa3982abf647afa038114a1324c1e97"} Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.358883 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.512854 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:32:10 crc kubenswrapper[4795]: E0219 22:32:10.513248 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 
22:32:11 crc kubenswrapper[4795]: I0219 22:32:11.367827 4795 generic.go:334] "Generic (PLEG): container finished" podID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerID="6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e" exitCode=0 Feb 19 22:32:11 crc kubenswrapper[4795]: I0219 22:32:11.367884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerDied","Data":"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e"} Feb 19 22:32:12 crc kubenswrapper[4795]: I0219 22:32:12.376649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerStarted","Data":"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe"} Feb 19 22:32:12 crc kubenswrapper[4795]: I0219 22:32:12.395383 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fjjbn" podStartSLOduration=1.9136564470000001 podStartE2EDuration="3.39536524s" podCreationTimestamp="2026-02-19 22:32:09 +0000 UTC" firstStartedPulling="2026-02-19 22:32:10.358499767 +0000 UTC m=+3841.551017661" lastFinishedPulling="2026-02-19 22:32:11.84020859 +0000 UTC m=+3843.032726454" observedRunningTime="2026-02-19 22:32:12.391426888 +0000 UTC m=+3843.583944752" watchObservedRunningTime="2026-02-19 22:32:12.39536524 +0000 UTC m=+3843.587883104" Feb 19 22:32:19 crc kubenswrapper[4795]: I0219 22:32:19.605955 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:19 crc kubenswrapper[4795]: I0219 22:32:19.606727 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:19 crc kubenswrapper[4795]: I0219 22:32:19.665576 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:20 crc kubenswrapper[4795]: I0219 22:32:20.495721 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:20 crc kubenswrapper[4795]: I0219 22:32:20.552058 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:22 crc kubenswrapper[4795]: I0219 22:32:22.444920 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fjjbn" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="registry-server" containerID="cri-o://1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" gracePeriod=2 Feb 19 22:32:22 crc kubenswrapper[4795]: I0219 22:32:22.961655 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.105322 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") pod \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.105376 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") pod \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.105412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") pod \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.108080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities" (OuterVolumeSpecName: "utilities") pod "9216072e-17e9-4ae1-8bfb-52ce38a26f21" (UID: "9216072e-17e9-4ae1-8bfb-52ce38a26f21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.112419 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs" (OuterVolumeSpecName: "kube-api-access-cmzjs") pod "9216072e-17e9-4ae1-8bfb-52ce38a26f21" (UID: "9216072e-17e9-4ae1-8bfb-52ce38a26f21"). InnerVolumeSpecName "kube-api-access-cmzjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.132710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9216072e-17e9-4ae1-8bfb-52ce38a26f21" (UID: "9216072e-17e9-4ae1-8bfb-52ce38a26f21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.207311 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") on node \"crc\" DevicePath \"\"" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.207341 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.207351 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453276 4795 generic.go:334] "Generic (PLEG): container finished" podID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerID="1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" exitCode=0 Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerDied","Data":"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe"} Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453340 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453365 4795 scope.go:117] "RemoveContainer" containerID="1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerDied","Data":"fd8dea2cc9816bec097755dd6dba91e5daa3982abf647afa038114a1324c1e97"} Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.484461 4795 scope.go:117] "RemoveContainer" containerID="6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.491205 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.498177 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.512393 4795 scope.go:117] "RemoveContainer" containerID="5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.516622 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:32:23 crc kubenswrapper[4795]: E0219 22:32:23.517205 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 
22:32:23.528447 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" path="/var/lib/kubelet/pods/9216072e-17e9-4ae1-8bfb-52ce38a26f21/volumes" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.548433 4795 scope.go:117] "RemoveContainer" containerID="1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" Feb 19 22:32:23 crc kubenswrapper[4795]: E0219 22:32:23.548911 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe\": container with ID starting with 1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe not found: ID does not exist" containerID="1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.548938 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe"} err="failed to get container status \"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe\": rpc error: code = NotFound desc = could not find container \"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe\": container with ID starting with 1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe not found: ID does not exist" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.548959 4795 scope.go:117] "RemoveContainer" containerID="6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e" Feb 19 22:32:23 crc kubenswrapper[4795]: E0219 22:32:23.549300 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e\": container with ID starting with 6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e not found: ID does not 
exist" containerID="6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.549351 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e"} err="failed to get container status \"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e\": rpc error: code = NotFound desc = could not find container \"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e\": container with ID starting with 6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e not found: ID does not exist" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.549377 4795 scope.go:117] "RemoveContainer" containerID="5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa" Feb 19 22:32:23 crc kubenswrapper[4795]: E0219 22:32:23.549653 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa\": container with ID starting with 5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa not found: ID does not exist" containerID="5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.549677 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa"} err="failed to get container status \"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa\": rpc error: code = NotFound desc = could not find container \"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa\": container with ID starting with 5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa not found: ID does not exist" Feb 19 22:32:38 crc kubenswrapper[4795]: I0219 22:32:38.511588 4795 scope.go:117] 
"RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:32:38 crc kubenswrapper[4795]: E0219 22:32:38.512559 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:32:49 crc kubenswrapper[4795]: I0219 22:32:49.516684 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:32:49 crc kubenswrapper[4795]: E0219 22:32:49.517473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:33:03 crc kubenswrapper[4795]: I0219 22:33:03.512192 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:33:03 crc kubenswrapper[4795]: I0219 22:33:03.731276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2"} Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.058022 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:02 crc kubenswrapper[4795]: E0219 
22:35:02.059080 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="extract-content" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.059103 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="extract-content" Feb 19 22:35:02 crc kubenswrapper[4795]: E0219 22:35:02.059121 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="registry-server" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.059132 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="registry-server" Feb 19 22:35:02 crc kubenswrapper[4795]: E0219 22:35:02.059168 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="extract-utilities" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.059203 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="extract-utilities" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.059445 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="registry-server" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.062164 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.083504 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.151651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.151736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.151769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.252793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.252940 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.253128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.253620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.253772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.302476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.413115 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.835918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:03 crc kubenswrapper[4795]: I0219 22:35:03.004435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerStarted","Data":"3fcf80fbd39f147a966a81e1f88e1f1ddc485406b83cb240cf7cd273c36899cc"} Feb 19 22:35:04 crc kubenswrapper[4795]: I0219 22:35:04.019619 4795 generic.go:334] "Generic (PLEG): container finished" podID="edef5238-d122-4fb5-9078-fff0c0e423af" containerID="0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126" exitCode=0 Feb 19 22:35:04 crc kubenswrapper[4795]: I0219 22:35:04.019715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerDied","Data":"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126"} Feb 19 22:35:06 crc kubenswrapper[4795]: I0219 22:35:06.037894 4795 generic.go:334] "Generic (PLEG): container finished" podID="edef5238-d122-4fb5-9078-fff0c0e423af" containerID="1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555" exitCode=0 Feb 19 22:35:06 crc kubenswrapper[4795]: I0219 22:35:06.037984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerDied","Data":"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555"} Feb 19 22:35:07 crc kubenswrapper[4795]: I0219 22:35:07.047371 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" 
event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerStarted","Data":"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d"} Feb 19 22:35:07 crc kubenswrapper[4795]: I0219 22:35:07.078307 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r4gjz" podStartSLOduration=2.577351891 podStartE2EDuration="5.078282424s" podCreationTimestamp="2026-02-19 22:35:02 +0000 UTC" firstStartedPulling="2026-02-19 22:35:04.022064247 +0000 UTC m=+4015.214582121" lastFinishedPulling="2026-02-19 22:35:06.52299475 +0000 UTC m=+4017.715512654" observedRunningTime="2026-02-19 22:35:07.071434299 +0000 UTC m=+4018.263952183" watchObservedRunningTime="2026-02-19 22:35:07.078282424 +0000 UTC m=+4018.270800328" Feb 19 22:35:12 crc kubenswrapper[4795]: I0219 22:35:12.414084 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:12 crc kubenswrapper[4795]: I0219 22:35:12.414852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:13 crc kubenswrapper[4795]: I0219 22:35:13.458371 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r4gjz" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" probeResult="failure" output=< Feb 19 22:35:13 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 22:35:13 crc kubenswrapper[4795]: > Feb 19 22:35:22 crc kubenswrapper[4795]: I0219 22:35:22.478972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:22 crc kubenswrapper[4795]: I0219 22:35:22.553154 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:22 crc kubenswrapper[4795]: I0219 
22:35:22.729288 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.200040 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r4gjz" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" containerID="cri-o://99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" gracePeriod=2 Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.714351 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.818084 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") pod \"edef5238-d122-4fb5-9078-fff0c0e423af\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.818218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") pod \"edef5238-d122-4fb5-9078-fff0c0e423af\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.818321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") pod \"edef5238-d122-4fb5-9078-fff0c0e423af\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.819237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities" (OuterVolumeSpecName: 
"utilities") pod "edef5238-d122-4fb5-9078-fff0c0e423af" (UID: "edef5238-d122-4fb5-9078-fff0c0e423af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.824408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4" (OuterVolumeSpecName: "kube-api-access-rntp4") pod "edef5238-d122-4fb5-9078-fff0c0e423af" (UID: "edef5238-d122-4fb5-9078-fff0c0e423af"). InnerVolumeSpecName "kube-api-access-rntp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.920220 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.920249 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.968949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edef5238-d122-4fb5-9078-fff0c0e423af" (UID: "edef5238-d122-4fb5-9078-fff0c0e423af"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.021616 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.216634 4795 generic.go:334] "Generic (PLEG): container finished" podID="edef5238-d122-4fb5-9078-fff0c0e423af" containerID="99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" exitCode=0 Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.216679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerDied","Data":"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d"} Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.216704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerDied","Data":"3fcf80fbd39f147a966a81e1f88e1f1ddc485406b83cb240cf7cd273c36899cc"} Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.216721 4795 scope.go:117] "RemoveContainer" containerID="99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.218796 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.243052 4795 scope.go:117] "RemoveContainer" containerID="1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.266024 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.272088 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.283054 4795 scope.go:117] "RemoveContainer" containerID="0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.298716 4795 scope.go:117] "RemoveContainer" containerID="99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" Feb 19 22:35:25 crc kubenswrapper[4795]: E0219 22:35:25.299254 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d\": container with ID starting with 99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d not found: ID does not exist" containerID="99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.299301 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d"} err="failed to get container status \"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d\": rpc error: code = NotFound desc = could not find container \"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d\": container with ID starting with 99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d not found: ID does 
not exist" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.299343 4795 scope.go:117] "RemoveContainer" containerID="1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555" Feb 19 22:35:25 crc kubenswrapper[4795]: E0219 22:35:25.299637 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555\": container with ID starting with 1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555 not found: ID does not exist" containerID="1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.299671 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555"} err="failed to get container status \"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555\": rpc error: code = NotFound desc = could not find container \"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555\": container with ID starting with 1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555 not found: ID does not exist" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.299690 4795 scope.go:117] "RemoveContainer" containerID="0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126" Feb 19 22:35:25 crc kubenswrapper[4795]: E0219 22:35:25.300235 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126\": container with ID starting with 0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126 not found: ID does not exist" containerID="0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.300282 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126"} err="failed to get container status \"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126\": rpc error: code = NotFound desc = could not find container \"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126\": container with ID starting with 0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126 not found: ID does not exist" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.524801 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" path="/var/lib/kubelet/pods/edef5238-d122-4fb5-9078-fff0c0e423af/volumes" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.427570 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.427937 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.940405 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:28 crc kubenswrapper[4795]: E0219 22:35:28.941335 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="extract-utilities" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.941362 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="extract-utilities" Feb 19 22:35:28 crc kubenswrapper[4795]: E0219 22:35:28.941396 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.941409 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" Feb 19 22:35:28 crc kubenswrapper[4795]: E0219 22:35:28.941426 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="extract-content" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.941439 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="extract-content" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.941684 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.944928 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.962770 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.082720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.082767 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.082806 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184077 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.210671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.290420 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.614232 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:30 crc kubenswrapper[4795]: I0219 22:35:30.261121 4795 generic.go:334] "Generic (PLEG): container finished" podID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerID="3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740" exitCode=0 Feb 19 22:35:30 crc kubenswrapper[4795]: I0219 22:35:30.261209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerDied","Data":"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740"} Feb 19 22:35:30 crc kubenswrapper[4795]: I0219 22:35:30.261953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerStarted","Data":"f25591d70490f0645bf4d9c7ca437387d9cbf3e1178df39d65c74bd44551e746"} Feb 19 22:35:31 crc kubenswrapper[4795]: I0219 22:35:31.274783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerStarted","Data":"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd"} Feb 19 22:35:32 crc kubenswrapper[4795]: I0219 22:35:32.283220 4795 generic.go:334] "Generic (PLEG): container finished" podID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerID="37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd" exitCode=0 Feb 19 22:35:32 crc kubenswrapper[4795]: I0219 22:35:32.283349 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" 
event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerDied","Data":"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd"} Feb 19 22:35:32 crc kubenswrapper[4795]: I0219 22:35:32.283690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerStarted","Data":"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8"} Feb 19 22:35:32 crc kubenswrapper[4795]: I0219 22:35:32.304958 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6cn6h" podStartSLOduration=2.916744929 podStartE2EDuration="4.304934872s" podCreationTimestamp="2026-02-19 22:35:28 +0000 UTC" firstStartedPulling="2026-02-19 22:35:30.264838097 +0000 UTC m=+4041.457355971" lastFinishedPulling="2026-02-19 22:35:31.65302805 +0000 UTC m=+4042.845545914" observedRunningTime="2026-02-19 22:35:32.300923297 +0000 UTC m=+4043.493441231" watchObservedRunningTime="2026-02-19 22:35:32.304934872 +0000 UTC m=+4043.497452736" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.290953 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.291672 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.357478 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.415967 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.592679 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.368940 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6cn6h" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="registry-server" containerID="cri-o://a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" gracePeriod=2 Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.822781 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.884284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") pod \"677da4f8-3439-41c1-b491-46ad22cf8f99\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.884381 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") pod \"677da4f8-3439-41c1-b491-46ad22cf8f99\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.884458 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") pod \"677da4f8-3439-41c1-b491-46ad22cf8f99\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.885438 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities" (OuterVolumeSpecName: "utilities") pod "677da4f8-3439-41c1-b491-46ad22cf8f99" (UID: 
"677da4f8-3439-41c1-b491-46ad22cf8f99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.889022 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn" (OuterVolumeSpecName: "kube-api-access-8b7gn") pod "677da4f8-3439-41c1-b491-46ad22cf8f99" (UID: "677da4f8-3439-41c1-b491-46ad22cf8f99"). InnerVolumeSpecName "kube-api-access-8b7gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.955948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "677da4f8-3439-41c1-b491-46ad22cf8f99" (UID: "677da4f8-3439-41c1-b491-46ad22cf8f99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.986023 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.986057 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.986092 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.385444 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerID="a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" exitCode=0 Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.385532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.385568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerDied","Data":"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8"} Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.386071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerDied","Data":"f25591d70490f0645bf4d9c7ca437387d9cbf3e1178df39d65c74bd44551e746"} Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.386113 4795 scope.go:117] "RemoveContainer" containerID="a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.411473 4795 scope.go:117] "RemoveContainer" containerID="37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.436387 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.443855 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.453487 4795 scope.go:117] "RemoveContainer" containerID="3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.477834 4795 scope.go:117] "RemoveContainer" 
containerID="a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" Feb 19 22:35:42 crc kubenswrapper[4795]: E0219 22:35:42.478236 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8\": container with ID starting with a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8 not found: ID does not exist" containerID="a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.478268 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8"} err="failed to get container status \"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8\": rpc error: code = NotFound desc = could not find container \"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8\": container with ID starting with a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8 not found: ID does not exist" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.478291 4795 scope.go:117] "RemoveContainer" containerID="37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd" Feb 19 22:35:42 crc kubenswrapper[4795]: E0219 22:35:42.478783 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd\": container with ID starting with 37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd not found: ID does not exist" containerID="37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.478837 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd"} err="failed to get container status \"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd\": rpc error: code = NotFound desc = could not find container \"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd\": container with ID starting with 37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd not found: ID does not exist" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.478879 4795 scope.go:117] "RemoveContainer" containerID="3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740" Feb 19 22:35:42 crc kubenswrapper[4795]: E0219 22:35:42.479251 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740\": container with ID starting with 3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740 not found: ID does not exist" containerID="3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.479285 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740"} err="failed to get container status \"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740\": rpc error: code = NotFound desc = could not find container \"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740\": container with ID starting with 3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740 not found: ID does not exist" Feb 19 22:35:43 crc kubenswrapper[4795]: I0219 22:35:43.530868 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" path="/var/lib/kubelet/pods/677da4f8-3439-41c1-b491-46ad22cf8f99/volumes" Feb 19 22:35:58 crc kubenswrapper[4795]: I0219 
22:35:58.427217 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:35:58 crc kubenswrapper[4795]: I0219 22:35:58.427915 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.427858 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.428489 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.428549 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.429227 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.429317 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2" gracePeriod=600 Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.772721 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2" exitCode=0 Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.772786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2"} Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.773101 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"} Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.773125 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.927668 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:24 crc kubenswrapper[4795]: E0219 22:37:24.928631 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="extract-utilities" Feb 19 22:37:24 crc 
kubenswrapper[4795]: I0219 22:37:24.928651 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="extract-utilities" Feb 19 22:37:24 crc kubenswrapper[4795]: E0219 22:37:24.928667 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="extract-content" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.928675 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="extract-content" Feb 19 22:37:24 crc kubenswrapper[4795]: E0219 22:37:24.928689 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="registry-server" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.928699 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="registry-server" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.928884 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="registry-server" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.930192 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.979279 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.063010 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.063079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.063555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.165451 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.165552 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.165590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.165964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.166209 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.184228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.310218 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.757613 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:25 crc kubenswrapper[4795]: W0219 22:37:25.762358 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fb7d96a_dce2_4dba_a43e_1fba3ebcb3ff.slice/crio-c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753 WatchSource:0}: Error finding container c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753: Status 404 returned error can't find the container with id c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753 Feb 19 22:37:26 crc kubenswrapper[4795]: I0219 22:37:26.246008 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerID="8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa" exitCode=0 Feb 19 22:37:26 crc kubenswrapper[4795]: I0219 22:37:26.246091 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerDied","Data":"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa"} Feb 19 22:37:26 crc kubenswrapper[4795]: I0219 22:37:26.246330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerStarted","Data":"c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753"} Feb 19 22:37:26 crc kubenswrapper[4795]: I0219 22:37:26.248564 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:37:28 crc kubenswrapper[4795]: I0219 22:37:28.266753 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerID="8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c" exitCode=0 Feb 19 22:37:28 crc kubenswrapper[4795]: I0219 22:37:28.266785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerDied","Data":"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c"} Feb 19 22:37:29 crc kubenswrapper[4795]: I0219 22:37:29.273740 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerStarted","Data":"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977"} Feb 19 22:37:29 crc kubenswrapper[4795]: I0219 22:37:29.288585 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhss8" podStartSLOduration=2.862673698 podStartE2EDuration="5.288569279s" podCreationTimestamp="2026-02-19 22:37:24 +0000 UTC" firstStartedPulling="2026-02-19 22:37:26.248160173 +0000 UTC m=+4157.440678067" lastFinishedPulling="2026-02-19 22:37:28.674055784 +0000 UTC m=+4159.866573648" observedRunningTime="2026-02-19 22:37:29.287139828 +0000 UTC m=+4160.479657692" watchObservedRunningTime="2026-02-19 22:37:29.288569279 +0000 UTC m=+4160.481087143" Feb 19 22:37:35 crc kubenswrapper[4795]: I0219 22:37:35.311223 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:35 crc kubenswrapper[4795]: I0219 22:37:35.311795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:35 crc kubenswrapper[4795]: I0219 22:37:35.364824 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhss8" Feb 19 
22:37:36 crc kubenswrapper[4795]: I0219 22:37:36.367416 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:36 crc kubenswrapper[4795]: I0219 22:37:36.408559 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.330690 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fhss8" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="registry-server" containerID="cri-o://5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" gracePeriod=2 Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.779000 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.870840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") pod \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.870914 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") pod \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.870942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") pod \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " Feb 
19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.871893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities" (OuterVolumeSpecName: "utilities") pod "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" (UID: "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.879422 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249" (OuterVolumeSpecName: "kube-api-access-w9249") pod "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" (UID: "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff"). InnerVolumeSpecName "kube-api-access-w9249". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.945529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" (UID: "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.971687 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.971720 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") on node \"crc\" DevicePath \"\"" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.971729 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.341053 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerID="5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" exitCode=0 Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.341124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerDied","Data":"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977"} Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.341248 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.342367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerDied","Data":"c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753"} Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.342395 4795 scope.go:117] "RemoveContainer" containerID="5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.362914 4795 scope.go:117] "RemoveContainer" containerID="8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.400625 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.407730 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.407910 4795 scope.go:117] "RemoveContainer" containerID="8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.441752 4795 scope.go:117] "RemoveContainer" containerID="5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" Feb 19 22:37:39 crc kubenswrapper[4795]: E0219 22:37:39.442346 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977\": container with ID starting with 5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977 not found: ID does not exist" containerID="5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.442396 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977"} err="failed to get container status \"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977\": rpc error: code = NotFound desc = could not find container \"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977\": container with ID starting with 5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977 not found: ID does not exist" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.442423 4795 scope.go:117] "RemoveContainer" containerID="8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c" Feb 19 22:37:39 crc kubenswrapper[4795]: E0219 22:37:39.443311 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c\": container with ID starting with 8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c not found: ID does not exist" containerID="8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.443377 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c"} err="failed to get container status \"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c\": rpc error: code = NotFound desc = could not find container \"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c\": container with ID starting with 8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c not found: ID does not exist" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.443627 4795 scope.go:117] "RemoveContainer" containerID="8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa" Feb 19 22:37:39 crc kubenswrapper[4795]: E0219 
22:37:39.443985 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa\": container with ID starting with 8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa not found: ID does not exist" containerID="8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.444022 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa"} err="failed to get container status \"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa\": rpc error: code = NotFound desc = could not find container \"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa\": container with ID starting with 8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa not found: ID does not exist" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.523324 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" path="/var/lib/kubelet/pods/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff/volumes" Feb 19 22:38:28 crc kubenswrapper[4795]: I0219 22:38:28.428247 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:38:28 crc kubenswrapper[4795]: I0219 22:38:28.428809 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 22:38:58 crc kubenswrapper[4795]: I0219 22:38:58.427417 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:38:58 crc kubenswrapper[4795]: I0219 22:38:58.428064 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.427133 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.427609 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.427660 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.428266 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"} 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.428319 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" gracePeriod=600 Feb 19 22:39:28 crc kubenswrapper[4795]: E0219 22:39:28.558120 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:39:29 crc kubenswrapper[4795]: I0219 22:39:29.301133 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" exitCode=0 Feb 19 22:39:29 crc kubenswrapper[4795]: I0219 22:39:29.301184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"} Feb 19 22:39:29 crc kubenswrapper[4795]: I0219 22:39:29.301596 4795 scope.go:117] "RemoveContainer" containerID="a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2" Feb 19 22:39:29 crc kubenswrapper[4795]: I0219 22:39:29.301995 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 
19 22:39:29 crc kubenswrapper[4795]: E0219 22:39:29.302207 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:39:44 crc kubenswrapper[4795]: I0219 22:39:44.513122 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:39:44 crc kubenswrapper[4795]: E0219 22:39:44.514603 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:39:59 crc kubenswrapper[4795]: I0219 22:39:59.520223 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:39:59 crc kubenswrapper[4795]: E0219 22:39:59.521365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:40:11 crc kubenswrapper[4795]: I0219 22:40:11.512073 4795 scope.go:117] "RemoveContainer" 
containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:40:11 crc kubenswrapper[4795]: E0219 22:40:11.513410 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:40:23 crc kubenswrapper[4795]: I0219 22:40:23.512365 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:40:23 crc kubenswrapper[4795]: E0219 22:40:23.513202 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:40:37 crc kubenswrapper[4795]: I0219 22:40:37.511969 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:40:37 crc kubenswrapper[4795]: E0219 22:40:37.512994 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:40:49 crc kubenswrapper[4795]: I0219 22:40:49.512694 4795 scope.go:117] 
"RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:40:49 crc kubenswrapper[4795]: E0219 22:40:49.515788 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:41:00 crc kubenswrapper[4795]: I0219 22:41:00.511903 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:41:00 crc kubenswrapper[4795]: E0219 22:41:00.512827 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.829441 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-6tsln"] Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.835068 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-6tsln"] Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.992699 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"] Feb 19 22:41:04 crc kubenswrapper[4795]: E0219 22:41:04.993040 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="registry-server" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 
22:41:04.993061 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="registry-server" Feb 19 22:41:04 crc kubenswrapper[4795]: E0219 22:41:04.993085 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="extract-content" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.993093 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="extract-content" Feb 19 22:41:04 crc kubenswrapper[4795]: E0219 22:41:04.993122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="extract-utilities" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.993131 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="extract-utilities" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.993310 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="registry-server" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.993881 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.996950 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.996979 4795 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4rmf9" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.997522 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.001597 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"] Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.003759 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.135479 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.135620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.135673 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") pod \"crc-storage-crc-dpkxw\" (UID: 
\"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.236460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.236520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.236576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.237444 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.237696 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.263642 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.330898 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.525036 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" path="/var/lib/kubelet/pods/dc847694-39ea-4c3c-bb58-0f920e59ac62/volumes" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.782749 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"] Feb 19 22:41:06 crc kubenswrapper[4795]: I0219 22:41:06.126412 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dpkxw" event={"ID":"5e61919b-4848-43ec-8a16-6d752a04c5ac","Type":"ContainerStarted","Data":"2ec619e5b6188cbce687e8d5092eede0d3323f72a885aad70e24f71d39f7a75b"} Feb 19 22:41:07 crc kubenswrapper[4795]: I0219 22:41:07.133597 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e61919b-4848-43ec-8a16-6d752a04c5ac" containerID="fa2e3c7da6ac02ed85489de382e40a9b438bc98edd6481546fefa8394b7e5fce" exitCode=0 Feb 19 22:41:07 crc kubenswrapper[4795]: I0219 22:41:07.133686 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dpkxw" event={"ID":"5e61919b-4848-43ec-8a16-6d752a04c5ac","Type":"ContainerDied","Data":"fa2e3c7da6ac02ed85489de382e40a9b438bc98edd6481546fefa8394b7e5fce"} Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.412515 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.480537 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") pod \"5e61919b-4848-43ec-8a16-6d752a04c5ac\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.480706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") pod \"5e61919b-4848-43ec-8a16-6d752a04c5ac\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.480795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") pod \"5e61919b-4848-43ec-8a16-6d752a04c5ac\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.480820 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "5e61919b-4848-43ec-8a16-6d752a04c5ac" (UID: "5e61919b-4848-43ec-8a16-6d752a04c5ac"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.481238 4795 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.485621 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7" (OuterVolumeSpecName: "kube-api-access-flhx7") pod "5e61919b-4848-43ec-8a16-6d752a04c5ac" (UID: "5e61919b-4848-43ec-8a16-6d752a04c5ac"). InnerVolumeSpecName "kube-api-access-flhx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.501882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "5e61919b-4848-43ec-8a16-6d752a04c5ac" (UID: "5e61919b-4848-43ec-8a16-6d752a04c5ac"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.582760 4795 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.582800 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") on node \"crc\" DevicePath \"\"" Feb 19 22:41:09 crc kubenswrapper[4795]: I0219 22:41:09.151395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dpkxw" event={"ID":"5e61919b-4848-43ec-8a16-6d752a04c5ac","Type":"ContainerDied","Data":"2ec619e5b6188cbce687e8d5092eede0d3323f72a885aad70e24f71d39f7a75b"} Feb 19 22:41:09 crc kubenswrapper[4795]: I0219 22:41:09.151437 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ec619e5b6188cbce687e8d5092eede0d3323f72a885aad70e24f71d39f7a75b" Feb 19 22:41:09 crc kubenswrapper[4795]: I0219 22:41:09.151466 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.852319 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"] Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.863009 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"] Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.995602 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-h7bnb"] Feb 19 22:41:10 crc kubenswrapper[4795]: E0219 22:41:10.995871 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e61919b-4848-43ec-8a16-6d752a04c5ac" containerName="storage" Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.995883 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e61919b-4848-43ec-8a16-6d752a04c5ac" containerName="storage" Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.996033 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e61919b-4848-43ec-8a16-6d752a04c5ac" containerName="storage" Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.996471 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.998620 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.998663 4795 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4rmf9" Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.998627 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.998894 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.010090 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h7bnb"] Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.134868 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.134916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.135086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") pod \"crc-storage-crc-h7bnb\" (UID: 
\"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.236633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.236686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.236722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.237512 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.238698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.257718 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.342696 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.521459 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e61919b-4848-43ec-8a16-6d752a04c5ac" path="/var/lib/kubelet/pods/5e61919b-4848-43ec-8a16-6d752a04c5ac/volumes" Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.770073 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h7bnb"] Feb 19 22:41:12 crc kubenswrapper[4795]: I0219 22:41:12.181493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h7bnb" event={"ID":"e4298263-238f-468e-b008-ebf615095a56","Type":"ContainerStarted","Data":"6d3623332377e39d8f770fbc105d739fb237218dab1b9b12419d65a8241cd9ff"} Feb 19 22:41:13 crc kubenswrapper[4795]: I0219 22:41:13.190221 4795 generic.go:334] "Generic (PLEG): container finished" podID="e4298263-238f-468e-b008-ebf615095a56" containerID="a3123c77876bceedf67b9501c6d94d5632c3bf437df7dc0f4cd344e9184635e6" exitCode=0 Feb 19 22:41:13 crc kubenswrapper[4795]: I0219 22:41:13.190452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h7bnb" event={"ID":"e4298263-238f-468e-b008-ebf615095a56","Type":"ContainerDied","Data":"a3123c77876bceedf67b9501c6d94d5632c3bf437df7dc0f4cd344e9184635e6"} Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.571847 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.676067 4795 scope.go:117] "RemoveContainer" containerID="1e723054d3a11aa58d4ff2996b03eae32a1cee8cb004d6442cd324f9c14218e2" Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.695862 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") pod \"e4298263-238f-468e-b008-ebf615095a56\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.695921 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") pod \"e4298263-238f-468e-b008-ebf615095a56\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.695973 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") pod \"e4298263-238f-468e-b008-ebf615095a56\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.696204 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "e4298263-238f-468e-b008-ebf615095a56" (UID: "e4298263-238f-468e-b008-ebf615095a56"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.703034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484" (OuterVolumeSpecName: "kube-api-access-9v484") pod "e4298263-238f-468e-b008-ebf615095a56" (UID: "e4298263-238f-468e-b008-ebf615095a56"). InnerVolumeSpecName "kube-api-access-9v484". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.717564 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "e4298263-238f-468e-b008-ebf615095a56" (UID: "e4298263-238f-468e-b008-ebf615095a56"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.797232 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") on node \"crc\" DevicePath \"\"" Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.797276 4795 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.797288 4795 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 19 22:41:15 crc kubenswrapper[4795]: I0219 22:41:15.222914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h7bnb" 
event={"ID":"e4298263-238f-468e-b008-ebf615095a56","Type":"ContainerDied","Data":"6d3623332377e39d8f770fbc105d739fb237218dab1b9b12419d65a8241cd9ff"} Feb 19 22:41:15 crc kubenswrapper[4795]: I0219 22:41:15.223379 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d3623332377e39d8f770fbc105d739fb237218dab1b9b12419d65a8241cd9ff" Feb 19 22:41:15 crc kubenswrapper[4795]: I0219 22:41:15.223341 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h7bnb" Feb 19 22:41:15 crc kubenswrapper[4795]: I0219 22:41:15.511787 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:41:15 crc kubenswrapper[4795]: E0219 22:41:15.512306 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:41:29 crc kubenswrapper[4795]: I0219 22:41:29.519070 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:41:29 crc kubenswrapper[4795]: E0219 22:41:29.519922 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:41:41 crc kubenswrapper[4795]: I0219 22:41:41.512660 4795 scope.go:117] "RemoveContainer" 
containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:41:41 crc kubenswrapper[4795]: E0219 22:41:41.514889 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:41:56 crc kubenswrapper[4795]: I0219 22:41:56.511613 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:41:56 crc kubenswrapper[4795]: E0219 22:41:56.512334 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:42:07 crc kubenswrapper[4795]: I0219 22:42:07.512199 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:42:07 crc kubenswrapper[4795]: E0219 22:42:07.513260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:42:20 crc kubenswrapper[4795]: I0219 22:42:20.511687 4795 scope.go:117] 
"RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:42:20 crc kubenswrapper[4795]: E0219 22:42:20.512484 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:42:34 crc kubenswrapper[4795]: I0219 22:42:34.511605 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:42:34 crc kubenswrapper[4795]: E0219 22:42:34.512534 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:42:46 crc kubenswrapper[4795]: I0219 22:42:46.511994 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:42:46 crc kubenswrapper[4795]: E0219 22:42:46.512936 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.201453 
4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"] Feb 19 22:42:57 crc kubenswrapper[4795]: E0219 22:42:57.202260 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4298263-238f-468e-b008-ebf615095a56" containerName="storage" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.202275 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4298263-238f-468e-b008-ebf615095a56" containerName="storage" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.202432 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4298263-238f-468e-b008-ebf615095a56" containerName="storage" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.203436 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.219726 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"] Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.293997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.294113 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.294277 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395178 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395971 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.416378 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.522631 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.976328 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"] Feb 19 22:42:58 crc kubenswrapper[4795]: I0219 22:42:58.511514 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:42:58 crc kubenswrapper[4795]: E0219 22:42:58.513608 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:42:58 crc kubenswrapper[4795]: I0219 22:42:58.972208 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerID="ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1" exitCode=0 Feb 19 22:42:58 crc 
kubenswrapper[4795]: I0219 22:42:58.972254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerDied","Data":"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1"} Feb 19 22:42:58 crc kubenswrapper[4795]: I0219 22:42:58.972283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerStarted","Data":"2f63c0c313b1177cbb92f2f2f5734fd3bb3497394ee74fa185fbfa12b567e96a"} Feb 19 22:42:58 crc kubenswrapper[4795]: I0219 22:42:58.975308 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:42:59 crc kubenswrapper[4795]: I0219 22:42:59.986114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerStarted","Data":"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e"} Feb 19 22:43:00 crc kubenswrapper[4795]: I0219 22:43:00.994940 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerID="28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e" exitCode=0 Feb 19 22:43:00 crc kubenswrapper[4795]: I0219 22:43:00.994983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerDied","Data":"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e"} Feb 19 22:43:02 crc kubenswrapper[4795]: I0219 22:43:02.004863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" 
event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerStarted","Data":"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22"} Feb 19 22:43:02 crc kubenswrapper[4795]: I0219 22:43:02.026409 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8pwv8" podStartSLOduration=2.374421014 podStartE2EDuration="5.026389137s" podCreationTimestamp="2026-02-19 22:42:57 +0000 UTC" firstStartedPulling="2026-02-19 22:42:58.974860907 +0000 UTC m=+4490.167378811" lastFinishedPulling="2026-02-19 22:43:01.62682907 +0000 UTC m=+4492.819346934" observedRunningTime="2026-02-19 22:43:02.025156011 +0000 UTC m=+4493.217673945" watchObservedRunningTime="2026-02-19 22:43:02.026389137 +0000 UTC m=+4493.218907001" Feb 19 22:43:07 crc kubenswrapper[4795]: I0219 22:43:07.523301 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:43:07 crc kubenswrapper[4795]: I0219 22:43:07.524335 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:43:07 crc kubenswrapper[4795]: I0219 22:43:07.569155 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:43:08 crc kubenswrapper[4795]: I0219 22:43:08.109673 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:43:08 crc kubenswrapper[4795]: I0219 22:43:08.179091 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"] Feb 19 22:43:09 crc kubenswrapper[4795]: I0219 22:43:09.521026 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:43:09 crc kubenswrapper[4795]: E0219 22:43:09.522216 4795 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.064910 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8pwv8" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="registry-server" containerID="cri-o://d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" gracePeriod=2 Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.564000 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.691267 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") pod \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.691645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") pod \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.691718 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") pod \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\" (UID: 
\"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.694251 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities" (OuterVolumeSpecName: "utilities") pod "2d5613ba-3d64-49d7-bba7-7b828e1c2948" (UID: "2d5613ba-3d64-49d7-bba7-7b828e1c2948"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.696877 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx" (OuterVolumeSpecName: "kube-api-access-jgtjx") pod "2d5613ba-3d64-49d7-bba7-7b828e1c2948" (UID: "2d5613ba-3d64-49d7-bba7-7b828e1c2948"). InnerVolumeSpecName "kube-api-access-jgtjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.715533 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d5613ba-3d64-49d7-bba7-7b828e1c2948" (UID: "2d5613ba-3d64-49d7-bba7-7b828e1c2948"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.793408 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.793443 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.793458 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") on node \"crc\" DevicePath \"\"" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075034 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerID="d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" exitCode=0 Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerDied","Data":"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22"} Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075223 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075252 4795 scope.go:117] "RemoveContainer" containerID="d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerDied","Data":"2f63c0c313b1177cbb92f2f2f5734fd3bb3497394ee74fa185fbfa12b567e96a"} Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.103093 4795 scope.go:117] "RemoveContainer" containerID="28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.127672 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"] Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.138021 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"] Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.141184 4795 scope.go:117] "RemoveContainer" containerID="ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.159719 4795 scope.go:117] "RemoveContainer" containerID="d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" Feb 19 22:43:11 crc kubenswrapper[4795]: E0219 22:43:11.160329 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22\": container with ID starting with d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22 not found: ID does not exist" containerID="d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.160369 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22"} err="failed to get container status \"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22\": rpc error: code = NotFound desc = could not find container \"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22\": container with ID starting with d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22 not found: ID does not exist" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.160395 4795 scope.go:117] "RemoveContainer" containerID="28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e" Feb 19 22:43:11 crc kubenswrapper[4795]: E0219 22:43:11.160744 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e\": container with ID starting with 28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e not found: ID does not exist" containerID="28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.160804 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e"} err="failed to get container status \"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e\": rpc error: code = NotFound desc = could not find container \"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e\": container with ID starting with 28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e not found: ID does not exist" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.160847 4795 scope.go:117] "RemoveContainer" containerID="ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1" Feb 19 22:43:11 crc kubenswrapper[4795]: E0219 
22:43:11.161362 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1\": container with ID starting with ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1 not found: ID does not exist" containerID="ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.161402 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1"} err="failed to get container status \"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1\": rpc error: code = NotFound desc = could not find container \"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1\": container with ID starting with ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1 not found: ID does not exist" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.526669 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" path="/var/lib/kubelet/pods/2d5613ba-3d64-49d7-bba7-7b828e1c2948/volumes" Feb 19 22:43:19 crc kubenswrapper[4795]: I0219 22:43:19.843076 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:43:19 crc kubenswrapper[4795]: E0219 22:43:19.845090 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:43:33 crc kubenswrapper[4795]: I0219 22:43:33.514492 
4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:43:33 crc kubenswrapper[4795]: E0219 22:43:33.515505 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:43:45 crc kubenswrapper[4795]: I0219 22:43:45.512587 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:43:45 crc kubenswrapper[4795]: E0219 22:43:45.513768 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:43:59 crc kubenswrapper[4795]: I0219 22:43:59.516845 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:43:59 crc kubenswrapper[4795]: E0219 22:43:59.517627 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:44:13 crc kubenswrapper[4795]: I0219 
22:44:13.515499 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:44:13 crc kubenswrapper[4795]: E0219 22:44:13.516813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:44:24 crc kubenswrapper[4795]: I0219 22:44:24.512071 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:44:24 crc kubenswrapper[4795]: E0219 22:44:24.512716 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.444503 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:28 crc kubenswrapper[4795]: E0219 22:44:28.445095 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="registry-server" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445108 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="registry-server" Feb 19 22:44:28 crc kubenswrapper[4795]: E0219 22:44:28.445122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="extract-utilities" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445128 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="extract-utilities" Feb 19 22:44:28 crc kubenswrapper[4795]: E0219 22:44:28.445144 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="extract-content" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445150 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="extract-content" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445299 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="registry-server" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445944 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.448134 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.448196 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.448659 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.449880 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.451511 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bbh84" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.469713 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.604976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.605418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.605453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.706600 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.706675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " 
pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.706745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.707670 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.707734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.713422 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.714990 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.733810 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.740031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.808579 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.808684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.808777 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.814383 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.910586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.910656 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.910683 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.912054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.912109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 
22:44:28.933127 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.029633 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.270193 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.483000 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:44:29 crc kubenswrapper[4795]: W0219 22:44:29.524790 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc119db7_03db_4838_b663_f244b7f93433.slice/crio-8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e WatchSource:0}: Error finding container 8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e: Status 404 returned error can't find the container with id 8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.590912 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.592255 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.593765 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.603669 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.603671 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.603733 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.604029 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rbx8c" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.604486 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723258 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723527 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723701 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.749790 4795 generic.go:334] "Generic (PLEG): container finished" podID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerID="e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7" exitCode=0 Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.749880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerDied","Data":"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7"} Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.749913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerStarted","Data":"01c621c9ebf25431f41781d9a945a324e3f1e0ba1f3afbd4aaf02e91fa196557"} Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.751385 4795 generic.go:334] "Generic (PLEG): container finished" podID="cc119db7-03db-4838-b663-f244b7f93433" containerID="6c19182bc802a37ab3a253eaa33c548a0f3f15e1ee3c00a8c016db959a5b8557" exitCode=0 Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.751444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" 
event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerDied","Data":"6c19182bc802a37ab3a253eaa33c548a0f3f15e1ee3c00a8c016db959a5b8557"} Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.751460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerStarted","Data":"8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e"} Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 
22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826843 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.827032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.827133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.827212 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.828605 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.829273 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.830786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.833296 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.833328 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68b77c70021788ffaf78dfe86ddece9c7d3d5c9cffb40f46dc15f8b79fc094aa/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.833772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.835879 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.835993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.836279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " 
pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.851687 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.908637 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.908664 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.914849 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.919985 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.920051 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.920213 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.920587 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.922607 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rh88g" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.928818 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.955846 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.030471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.030873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031477 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031625 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031830 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.032122 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.133839 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.133882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135224 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135309 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135409 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135476 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.136886 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.137257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 
crc kubenswrapper[4795]: I0219 22:44:30.138083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.138437 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.142804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.149303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.149514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.150311 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.150354 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ce7e4e38eafbc5e4f450d6daa37e1c166de00b3db9bc92a3ff92b374f626b390/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.168308 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.186353 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.242972 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.418051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:44:30 crc kubenswrapper[4795]: W0219 22:44:30.421956 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cd04173_2975_46bd_8602_f6561387d717.slice/crio-8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd WatchSource:0}: Error finding container 8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd: Status 404 returned error can't find the container with id 8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.658735 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:44:30 crc kubenswrapper[4795]: W0219 22:44:30.663491 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd43ca2d_c7e3_4fc7_84a5_74b50cadd268.slice/crio-e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66 WatchSource:0}: Error finding container e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66: Status 404 returned error can't find the container with id e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66 Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.769733 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerStarted","Data":"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712"} Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.770189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 
22:44:30.772904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerStarted","Data":"8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd"} Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.778918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerStarted","Data":"64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556"} Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.779133 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.780677 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerStarted","Data":"e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66"} Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.795714 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" podStartSLOduration=2.795685182 podStartE2EDuration="2.795685182s" podCreationTimestamp="2026-02-19 22:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:30.791212003 +0000 UTC m=+4581.983729857" watchObservedRunningTime="2026-02-19 22:44:30.795685182 +0000 UTC m=+4581.988203076" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.810092 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" podStartSLOduration=2.8100759760000003 podStartE2EDuration="2.810075976s" podCreationTimestamp="2026-02-19 22:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:30.806532534 +0000 UTC m=+4581.999050398" watchObservedRunningTime="2026-02-19 22:44:30.810075976 +0000 UTC m=+4582.002593830" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.244866 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.246876 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.249783 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-phh6q" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.250058 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.250072 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.250878 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.260343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.271988 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352444 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352558 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-kolla-config\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352610 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxqtp\" (UniqueName: \"kubernetes.io/projected/24345708-df30-4486-bc7e-44eaa7722ffd-kube-api-access-cxqtp\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352708 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-default\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352803 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454774 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-kolla-config\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454803 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqtp\" (UniqueName: \"kubernetes.io/projected/24345708-df30-4486-bc7e-44eaa7722ffd-kube-api-access-cxqtp\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454820 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454850 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454873 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-default\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.455776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.456436 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-kolla-config\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.456553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-default\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.456864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.460706 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.463910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " 
pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.464060 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.464144 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1165ef74fbcf4a9deee3143388ab045d5c4f0facc83de427042e5c5065a01ca4/globalmount\"" pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.476104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxqtp\" (UniqueName: \"kubernetes.io/projected/24345708-df30-4486-bc7e-44eaa7722ffd-kube-api-access-cxqtp\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.494173 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.572791 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.631092 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.631959 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.635663 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-r89zr" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.636914 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.641804 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.765944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kolla-config\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.766339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-config-data\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.766361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxdjr\" (UniqueName: \"kubernetes.io/projected/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kube-api-access-bxdjr\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 
22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.788395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerStarted","Data":"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a"} Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.792793 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerStarted","Data":"48d9ac5474c94a11a6da74ec61c602260d05f7961e1a831acf6b59d650ddb95f"} Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.867563 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kolla-config\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.867698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-config-data\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.867727 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxdjr\" (UniqueName: \"kubernetes.io/projected/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kube-api-access-bxdjr\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.868962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kolla-config\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " 
pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.869029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-config-data\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.884066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxdjr\" (UniqueName: \"kubernetes.io/projected/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kube-api-access-bxdjr\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.991992 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.102864 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 22:44:32 crc kubenswrapper[4795]: W0219 22:44:32.111764 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24345708_df30_4486_bc7e_44eaa7722ffd.slice/crio-cf85a9ef7b807f9974ab2cbe53b98edbf6af52cd1621fe033355c6cbf1228f54 WatchSource:0}: Error finding container cf85a9ef7b807f9974ab2cbe53b98edbf6af52cd1621fe033355c6cbf1228f54: Status 404 returned error can't find the container with id cf85a9ef7b807f9974ab2cbe53b98edbf6af52cd1621fe033355c6cbf1228f54 Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.459982 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.719439 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.720600 4795 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.724330 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dpr8j" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.725423 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.725695 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.729664 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.745439 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784520 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56864\" (UniqueName: \"kubernetes.io/projected/c8f55130-d799-45ef-b174-450b6c3b52ff-kube-api-access-56864\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784809 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.802648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3d516d65-1efc-42ee-ab17-971e2d94e4a7","Type":"ContainerStarted","Data":"542e1c232194402e8e8b9ea4bb6f613c9c7838e74833e76b1c80ddf15722c8b6"} Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.812884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"24345708-df30-4486-bc7e-44eaa7722ffd","Type":"ContainerStarted","Data":"98338aaa8b7058583c69c89378f49b27fd375f134eb71210d2b4a7d131e400c2"} Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.812935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"24345708-df30-4486-bc7e-44eaa7722ffd","Type":"ContainerStarted","Data":"cf85a9ef7b807f9974ab2cbe53b98edbf6af52cd1621fe033355c6cbf1228f54"} Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.896929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.896987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " 
pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56864\" (UniqueName: \"kubernetes.io/projected/c8f55130-d799-45ef-b174-450b6c3b52ff-kube-api-access-56864\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc 
kubenswrapper[4795]: I0219 22:44:32.897729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.901598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.903641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.905666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.909367 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.909607 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.917503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.026019 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.026087 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a8f4eba54544a651e5637ee119bec0f221e08393ed9dbf224d2d4dc3517dd96b/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.027820 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56864\" (UniqueName: \"kubernetes.io/projected/c8f55130-d799-45ef-b174-450b6c3b52ff-kube-api-access-56864\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.056440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.351651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.827556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.827835 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3d516d65-1efc-42ee-ab17-971e2d94e4a7","Type":"ContainerStarted","Data":"e8e119ce9753532a404fdc5789740424a476dbf17064d349b96fc359a962c90b"} Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.847399 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.8473679450000002 podStartE2EDuration="2.847367945s" podCreationTimestamp="2026-02-19 22:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:33.839884959 +0000 UTC m=+4585.032402863" watchObservedRunningTime="2026-02-19 22:44:33.847367945 +0000 UTC m=+4585.039885829" Feb 19 22:44:34 crc kubenswrapper[4795]: W0219 22:44:34.132543 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8f55130_d799_45ef_b174_450b6c3b52ff.slice/crio-0db400379163aac0fd1a842dce6434bcf2d05bbf475831f1adc63f9acfbd742f WatchSource:0}: Error finding container 0db400379163aac0fd1a842dce6434bcf2d05bbf475831f1adc63f9acfbd742f: Status 404 returned error can't find the container with id 
0db400379163aac0fd1a842dce6434bcf2d05bbf475831f1adc63f9acfbd742f Feb 19 22:44:34 crc kubenswrapper[4795]: I0219 22:44:34.836477 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8f55130-d799-45ef-b174-450b6c3b52ff","Type":"ContainerStarted","Data":"c0a61604098544497ff7d80649240bf23ed9ae4b34dabfe24a1174f77e04b797"} Feb 19 22:44:34 crc kubenswrapper[4795]: I0219 22:44:34.836816 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 22:44:34 crc kubenswrapper[4795]: I0219 22:44:34.836832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8f55130-d799-45ef-b174-450b6c3b52ff","Type":"ContainerStarted","Data":"0db400379163aac0fd1a842dce6434bcf2d05bbf475831f1adc63f9acfbd742f"} Feb 19 22:44:35 crc kubenswrapper[4795]: I0219 22:44:35.512237 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:44:35 crc kubenswrapper[4795]: I0219 22:44:35.846977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640"} Feb 19 22:44:36 crc kubenswrapper[4795]: I0219 22:44:36.854786 4795 generic.go:334] "Generic (PLEG): container finished" podID="24345708-df30-4486-bc7e-44eaa7722ffd" containerID="98338aaa8b7058583c69c89378f49b27fd375f134eb71210d2b4a7d131e400c2" exitCode=0 Feb 19 22:44:36 crc kubenswrapper[4795]: I0219 22:44:36.854871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"24345708-df30-4486-bc7e-44eaa7722ffd","Type":"ContainerDied","Data":"98338aaa8b7058583c69c89378f49b27fd375f134eb71210d2b4a7d131e400c2"} Feb 19 22:44:37 crc kubenswrapper[4795]: I0219 22:44:37.861683 4795 
generic.go:334] "Generic (PLEG): container finished" podID="c8f55130-d799-45ef-b174-450b6c3b52ff" containerID="c0a61604098544497ff7d80649240bf23ed9ae4b34dabfe24a1174f77e04b797" exitCode=0 Feb 19 22:44:37 crc kubenswrapper[4795]: I0219 22:44:37.862248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8f55130-d799-45ef-b174-450b6c3b52ff","Type":"ContainerDied","Data":"c0a61604098544497ff7d80649240bf23ed9ae4b34dabfe24a1174f77e04b797"} Feb 19 22:44:37 crc kubenswrapper[4795]: I0219 22:44:37.864548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"24345708-df30-4486-bc7e-44eaa7722ffd","Type":"ContainerStarted","Data":"09d226b956b0ebbd3ab43c1ec12ae8658d98faece1941eba3a2d96a00df4303c"} Feb 19 22:44:37 crc kubenswrapper[4795]: I0219 22:44:37.926908 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.926886218 podStartE2EDuration="7.926886218s" podCreationTimestamp="2026-02-19 22:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:37.919735852 +0000 UTC m=+4589.112253806" watchObservedRunningTime="2026-02-19 22:44:37.926886218 +0000 UTC m=+4589.119404092" Feb 19 22:44:38 crc kubenswrapper[4795]: I0219 22:44:38.817910 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:38 crc kubenswrapper[4795]: I0219 22:44:38.880251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8f55130-d799-45ef-b174-450b6c3b52ff","Type":"ContainerStarted","Data":"b87c22825038b02f1820139804d67606a5e717ce2721120327186d2e9efd4ae2"} Feb 19 22:44:38 crc kubenswrapper[4795]: I0219 22:44:38.918852 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.918827735 podStartE2EDuration="7.918827735s" podCreationTimestamp="2026-02-19 22:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:38.91554633 +0000 UTC m=+4590.108064194" watchObservedRunningTime="2026-02-19 22:44:38.918827735 +0000 UTC m=+4590.111345629" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.032455 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.097484 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.097729 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="dnsmasq-dns" containerID="cri-o://72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" gracePeriod=10 Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.625863 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.731647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") pod \"b743a36e-23aa-4a29-b400-a91ed0788bd7\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.731800 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") pod \"b743a36e-23aa-4a29-b400-a91ed0788bd7\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.731850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") pod \"b743a36e-23aa-4a29-b400-a91ed0788bd7\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.739413 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn" (OuterVolumeSpecName: "kube-api-access-lhmhn") pod "b743a36e-23aa-4a29-b400-a91ed0788bd7" (UID: "b743a36e-23aa-4a29-b400-a91ed0788bd7"). InnerVolumeSpecName "kube-api-access-lhmhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.763422 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b743a36e-23aa-4a29-b400-a91ed0788bd7" (UID: "b743a36e-23aa-4a29-b400-a91ed0788bd7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.772520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config" (OuterVolumeSpecName: "config") pod "b743a36e-23aa-4a29-b400-a91ed0788bd7" (UID: "b743a36e-23aa-4a29-b400-a91ed0788bd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.833716 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.833749 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.833759 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.910922 4795 generic.go:334] "Generic (PLEG): container finished" podID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerID="72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" exitCode=0 Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.910985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerDied","Data":"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712"} Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.911018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" 
event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerDied","Data":"01c621c9ebf25431f41781d9a945a324e3f1e0ba1f3afbd4aaf02e91fa196557"} Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.911036 4795 scope.go:117] "RemoveContainer" containerID="72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.911288 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.934424 4795 scope.go:117] "RemoveContainer" containerID="e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.962548 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.967454 4795 scope.go:117] "RemoveContainer" containerID="72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.971243 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:39 crc kubenswrapper[4795]: E0219 22:44:39.973334 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712\": container with ID starting with 72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712 not found: ID does not exist" containerID="72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.973391 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712"} err="failed to get container status 
\"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712\": rpc error: code = NotFound desc = could not find container \"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712\": container with ID starting with 72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712 not found: ID does not exist" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.973432 4795 scope.go:117] "RemoveContainer" containerID="e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7" Feb 19 22:44:39 crc kubenswrapper[4795]: E0219 22:44:39.975766 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7\": container with ID starting with e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7 not found: ID does not exist" containerID="e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.975812 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7"} err="failed to get container status \"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7\": rpc error: code = NotFound desc = could not find container \"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7\": container with ID starting with e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7 not found: ID does not exist" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.522508 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" path="/var/lib/kubelet/pods/b743a36e-23aa-4a29-b400-a91ed0788bd7/volumes" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.573010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 19 22:44:41 crc 
kubenswrapper[4795]: I0219 22:44:41.573539 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.712326 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.995020 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.999707 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 19 22:44:43 crc kubenswrapper[4795]: I0219 22:44:43.352058 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:43 crc kubenswrapper[4795]: I0219 22:44:43.352516 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:43 crc kubenswrapper[4795]: I0219 22:44:43.741597 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:44 crc kubenswrapper[4795]: I0219 22:44:44.030566 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.245108 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:50 crc kubenswrapper[4795]: E0219 22:44:50.246160 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="init" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.246212 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="init" Feb 19 22:44:50 crc kubenswrapper[4795]: E0219 22:44:50.246280 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="dnsmasq-dns" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.246298 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="dnsmasq-dns" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.246595 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="dnsmasq-dns" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.247445 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.250375 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.262784 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.425641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.425718 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.526608 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.527088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.528357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.547471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.591015 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:51 crc kubenswrapper[4795]: W0219 22:44:51.095407 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod566c3329_8a98_426c_a847_7bdf7df37653.slice/crio-4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746 WatchSource:0}: Error finding container 4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746: Status 404 returned error can't find the container with id 4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746 Feb 19 22:44:51 crc kubenswrapper[4795]: I0219 22:44:51.096810 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:52 crc kubenswrapper[4795]: I0219 22:44:52.007834 4795 generic.go:334] "Generic (PLEG): container finished" podID="566c3329-8a98-426c-a847-7bdf7df37653" containerID="9acf4103dbab2da287716a026f87427a75c0734ef6734fb45eb23664d9f962a3" exitCode=0 Feb 19 22:44:52 crc kubenswrapper[4795]: I0219 22:44:52.007876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qr9lx" event={"ID":"566c3329-8a98-426c-a847-7bdf7df37653","Type":"ContainerDied","Data":"9acf4103dbab2da287716a026f87427a75c0734ef6734fb45eb23664d9f962a3"} Feb 19 22:44:52 crc kubenswrapper[4795]: I0219 22:44:52.007906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qr9lx" event={"ID":"566c3329-8a98-426c-a847-7bdf7df37653","Type":"ContainerStarted","Data":"4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746"} Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.415314 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.580874 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") pod \"566c3329-8a98-426c-a847-7bdf7df37653\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.580942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") pod \"566c3329-8a98-426c-a847-7bdf7df37653\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.582086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "566c3329-8a98-426c-a847-7bdf7df37653" (UID: "566c3329-8a98-426c-a847-7bdf7df37653"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.586827 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2" (OuterVolumeSpecName: "kube-api-access-x7tc2") pod "566c3329-8a98-426c-a847-7bdf7df37653" (UID: "566c3329-8a98-426c-a847-7bdf7df37653"). InnerVolumeSpecName "kube-api-access-x7tc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.682699 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.682738 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:54 crc kubenswrapper[4795]: I0219 22:44:54.032013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qr9lx" event={"ID":"566c3329-8a98-426c-a847-7bdf7df37653","Type":"ContainerDied","Data":"4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746"} Feb 19 22:44:54 crc kubenswrapper[4795]: I0219 22:44:54.032324 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746" Feb 19 22:44:54 crc kubenswrapper[4795]: I0219 22:44:54.032102 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:56 crc kubenswrapper[4795]: I0219 22:44:56.678378 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:56 crc kubenswrapper[4795]: I0219 22:44:56.690742 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:57 crc kubenswrapper[4795]: I0219 22:44:57.523008 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="566c3329-8a98-426c-a847-7bdf7df37653" path="/var/lib/kubelet/pods/566c3329-8a98-426c-a847-7bdf7df37653/volumes" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.156676 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"] Feb 19 22:45:00 crc kubenswrapper[4795]: E0219 22:45:00.157388 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="566c3329-8a98-426c-a847-7bdf7df37653" containerName="mariadb-account-create-update" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.157407 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="566c3329-8a98-426c-a847-7bdf7df37653" containerName="mariadb-account-create-update" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.157693 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="566c3329-8a98-426c-a847-7bdf7df37653" containerName="mariadb-account-create-update" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.158361 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.163918 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.164375 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.173800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"] Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.304682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.304726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.304752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.406644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.406944 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.407045 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.407798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.420416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.424230 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.521539 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.937844 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"] Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.098939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" event={"ID":"54bacd9c-6bce-433c-972c-3990566baa40","Type":"ContainerStarted","Data":"eca52f37003a7a5168093ed1a2726d31a52843bd3e99b81128785c0ad70b60e9"} Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.099228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" event={"ID":"54bacd9c-6bce-433c-972c-3990566baa40","Type":"ContainerStarted","Data":"2c2f5859aa5c96d4c7919768d2d7d44779670de8216920b069687351f3e33599"} Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.124526 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" 
podStartSLOduration=1.124506776 podStartE2EDuration="1.124506776s" podCreationTimestamp="2026-02-19 22:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:45:01.115763445 +0000 UTC m=+4612.308281349" watchObservedRunningTime="2026-02-19 22:45:01.124506776 +0000 UTC m=+4612.317024660" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.668622 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.669615 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.671565 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.683633 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.729870 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.729935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 
22:45:01.831062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.831130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.832553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.847943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.990587 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:02 crc kubenswrapper[4795]: I0219 22:45:02.109609 4795 generic.go:334] "Generic (PLEG): container finished" podID="54bacd9c-6bce-433c-972c-3990566baa40" containerID="eca52f37003a7a5168093ed1a2726d31a52843bd3e99b81128785c0ad70b60e9" exitCode=0 Feb 19 22:45:02 crc kubenswrapper[4795]: I0219 22:45:02.109649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" event={"ID":"54bacd9c-6bce-433c-972c-3990566baa40","Type":"ContainerDied","Data":"eca52f37003a7a5168093ed1a2726d31a52843bd3e99b81128785c0ad70b60e9"} Feb 19 22:45:02 crc kubenswrapper[4795]: I0219 22:45:02.460995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:45:02 crc kubenswrapper[4795]: W0219 22:45:02.465477 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfa8dda8_f620_4331_8909_b10784ceeab8.slice/crio-69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8 WatchSource:0}: Error finding container 69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8: Status 404 returned error can't find the container with id 69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8 Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.121730 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfa8dda8-f620-4331-8909-b10784ceeab8" containerID="595d4f05c4b7ec834570db4adf844a0fca41d1bed50151677fc612dd1b0457bc" exitCode=0 Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.121845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vrr5x" event={"ID":"cfa8dda8-f620-4331-8909-b10784ceeab8","Type":"ContainerDied","Data":"595d4f05c4b7ec834570db4adf844a0fca41d1bed50151677fc612dd1b0457bc"} Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 
22:45:03.121883 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vrr5x" event={"ID":"cfa8dda8-f620-4331-8909-b10784ceeab8","Type":"ContainerStarted","Data":"69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8"}
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.555322 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.561891 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") pod \"54bacd9c-6bce-433c-972c-3990566baa40\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") "
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.561945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") pod \"54bacd9c-6bce-433c-972c-3990566baa40\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") "
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.561970 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") pod \"54bacd9c-6bce-433c-972c-3990566baa40\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") "
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.563442 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume" (OuterVolumeSpecName: "config-volume") pod "54bacd9c-6bce-433c-972c-3990566baa40" (UID: "54bacd9c-6bce-433c-972c-3990566baa40"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.568341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp" (OuterVolumeSpecName: "kube-api-access-tj8mp") pod "54bacd9c-6bce-433c-972c-3990566baa40" (UID: "54bacd9c-6bce-433c-972c-3990566baa40"). InnerVolumeSpecName "kube-api-access-tj8mp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.568343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "54bacd9c-6bce-433c-972c-3990566baa40" (UID: "54bacd9c-6bce-433c-972c-3990566baa40"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.664493 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.664544 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.664568 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.133990 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerID="48d9ac5474c94a11a6da74ec61c602260d05f7961e1a831acf6b59d650ddb95f" exitCode=0
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.134104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerDied","Data":"48d9ac5474c94a11a6da74ec61c602260d05f7961e1a831acf6b59d650ddb95f"}
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.136607 4795 generic.go:334] "Generic (PLEG): container finished" podID="9cd04173-2975-46bd-8602-f6561387d717" containerID="ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a" exitCode=0
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.136719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerDied","Data":"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a"}
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.139122 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.139117 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" event={"ID":"54bacd9c-6bce-433c-972c-3990566baa40","Type":"ContainerDied","Data":"2c2f5859aa5c96d4c7919768d2d7d44779670de8216920b069687351f3e33599"}
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.139351 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2f5859aa5c96d4c7919768d2d7d44779670de8216920b069687351f3e33599"
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.379841 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vrr5x"
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.473497 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") pod \"cfa8dda8-f620-4331-8909-b10784ceeab8\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") "
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.473780 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") pod \"cfa8dda8-f620-4331-8909-b10784ceeab8\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") "
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.474234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfa8dda8-f620-4331-8909-b10784ceeab8" (UID: "cfa8dda8-f620-4331-8909-b10784ceeab8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.476528 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc" (OuterVolumeSpecName: "kube-api-access-s5fbc") pod "cfa8dda8-f620-4331-8909-b10784ceeab8" (UID: "cfa8dda8-f620-4331-8909-b10784ceeab8"). InnerVolumeSpecName "kube-api-access-s5fbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.574869 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.574903 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.628655 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"]
Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.635053 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"]
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.154840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerStarted","Data":"5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889"}
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.155035 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.156331 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vrr5x"
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.156371 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vrr5x" event={"ID":"cfa8dda8-f620-4331-8909-b10784ceeab8","Type":"ContainerDied","Data":"69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8"}
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.156423 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8"
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.158485 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerStarted","Data":"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01"}
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.158690 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.183659 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.183634332 podStartE2EDuration="37.183634332s" podCreationTimestamp="2026-02-19 22:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:45:05.179266657 +0000 UTC m=+4616.371784541" watchObservedRunningTime="2026-02-19 22:45:05.183634332 +0000 UTC m=+4616.376152196"
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.208457 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.208439937 podStartE2EDuration="37.208439937s" podCreationTimestamp="2026-02-19 22:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:45:05.206301765 +0000 UTC m=+4616.398819629" watchObservedRunningTime="2026-02-19 22:45:05.208439937 +0000 UTC m=+4616.400957801"
Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.523066 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" path="/var/lib/kubelet/pods/b6bba469-9e7c-4517-bc8d-2d5a5308edef/volumes"
Feb 19 22:45:14 crc kubenswrapper[4795]: I0219 22:45:14.781248 4795 scope.go:117] "RemoveContainer" containerID="32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418"
Feb 19 22:45:19 crc kubenswrapper[4795]: I0219 22:45:19.958327 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 19 22:45:20 crc kubenswrapper[4795]: I0219 22:45:20.246156 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.634906 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"]
Feb 19 22:45:22 crc kubenswrapper[4795]: E0219 22:45:22.636300 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa8dda8-f620-4331-8909-b10784ceeab8" containerName="mariadb-account-create-update"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.636460 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa8dda8-f620-4331-8909-b10784ceeab8" containerName="mariadb-account-create-update"
Feb 19 22:45:22 crc kubenswrapper[4795]: E0219 22:45:22.636607 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54bacd9c-6bce-433c-972c-3990566baa40" containerName="collect-profiles"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.636725 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bacd9c-6bce-433c-972c-3990566baa40" containerName="collect-profiles"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.637105 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa8dda8-f620-4331-8909-b10784ceeab8" containerName="mariadb-account-create-update"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.637304 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bacd9c-6bce-433c-972c-3990566baa40" containerName="collect-profiles"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.638809 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.647219 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"]
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.781988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.782063 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.782148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.883360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.883538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.883568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.884212 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.884460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.918225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.958481 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:23 crc kubenswrapper[4795]: I0219 22:45:23.314773 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:45:23 crc kubenswrapper[4795]: I0219 22:45:23.380888 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"]
Feb 19 22:45:23 crc kubenswrapper[4795]: I0219 22:45:23.971357 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 22:45:24 crc kubenswrapper[4795]: I0219 22:45:24.309200 4795 generic.go:334] "Generic (PLEG): container finished" podID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerID="99d134900f88fca6c416350229ec6ebd6aa32403658dd7fa964885fc2b39579a" exitCode=0
Feb 19 22:45:24 crc kubenswrapper[4795]: I0219 22:45:24.309248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerDied","Data":"99d134900f88fca6c416350229ec6ebd6aa32403658dd7fa964885fc2b39579a"}
Feb 19 22:45:24 crc kubenswrapper[4795]: I0219 22:45:24.309279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerStarted","Data":"188ca574ffaf1ffa388763be3f13a7eb4afedf6a896f0e7de263093402b89351"}
Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.258055 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq" containerID="cri-o://193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01" gracePeriod=604799
Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.317861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerStarted","Data":"00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69"}
Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.319098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.340216 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" podStartSLOduration=3.340197004 podStartE2EDuration="3.340197004s" podCreationTimestamp="2026-02-19 22:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:45:25.333985115 +0000 UTC m=+4636.526502979" watchObservedRunningTime="2026-02-19 22:45:25.340197004 +0000 UTC m=+4636.532714868"
Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.960812 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" containerID="cri-o://5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889" gracePeriod=604799
Feb 19 22:45:29 crc kubenswrapper[4795]: I0219 22:45:29.956842 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.243:5672: connect: connection refused"
Feb 19 22:45:30 crc kubenswrapper[4795]: I0219 22:45:30.244358 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.244:5672: connect: connection refused"
Feb 19 22:45:31 crc kubenswrapper[4795]: I0219 22:45:31.913842 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029450 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029600 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029621 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029742 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.030617 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.031022 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.031060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.034988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk" (OuterVolumeSpecName: "kube-api-access-wnbmk") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "kube-api-access-wnbmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.035415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.036231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info" (OuterVolumeSpecName: "pod-info") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.053860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf" (OuterVolumeSpecName: "server-conf") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.059366 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f" (OuterVolumeSpecName: "persistence") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.111645 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131694 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131806 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") on node \"crc\" "
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131826 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131840 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131854 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131864 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131874 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131887 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131899 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.146905 4795 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.147442 4795 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f") on node "crc"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.233253 4795 reconciler_common.go:293] "Volume detached for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.377179 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerID="5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889" exitCode=0
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.377242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerDied","Data":"5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889"}
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379119 4795 generic.go:334] "Generic (PLEG): container finished" podID="9cd04173-2975-46bd-8602-f6561387d717" containerID="193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01" exitCode=0
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerDied","Data":"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01"}
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379350 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerDied","Data":"8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd"}
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379371 4795 scope.go:117] "RemoveContainer" containerID="193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379570 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.404838 4795 scope.go:117] "RemoveContainer" containerID="ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.427659 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.449455 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.468845 4795 scope.go:117] "RemoveContainer" containerID="193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.468988 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:45:32 crc kubenswrapper[4795]: E0219 22:45:32.469432 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.469457 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq"
Feb 19 22:45:32 crc kubenswrapper[4795]: E0219 22:45:32.469472 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="setup-container"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.469491 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="setup-container"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.469695 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.470812 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.471541 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.474440 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.476259 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.476400 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.476620 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rbx8c"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.476733 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 19 22:45:32 crc kubenswrapper[4795]: E0219 22:45:32.477443 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01\": container with ID starting with 193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01 not found: ID does not exist" containerID="193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.477504 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01"} err="failed to get container status \"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01\": rpc error: code = NotFound desc = could not find container \"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01\": container with ID starting with 193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01 not found: ID does not exist"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.477540 4795 scope.go:117] "RemoveContainer" containerID="ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a"
Feb 19 22:45:32 crc kubenswrapper[4795]: E0219 22:45:32.477922 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a\": container with ID starting with ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a not found: ID does not exist" containerID="ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.477953 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a"} err="failed to get container status \"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a\": rpc error: code = NotFound desc = could not find container \"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a\": container with ID starting with ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a not found: ID does not exist"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.546996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547048 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547132 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cfzv\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-kube-api-access-8cfzv\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547182 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547208 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0"
Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547254 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.606476 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.648458 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.648572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.648609 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.648681 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649533 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650002 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cfzv\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-kube-api-access-8cfzv\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: 
I0219 22:45:32.650330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650560 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650586 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.651232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.651308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.651719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.652341 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.652570 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.653483 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info" (OuterVolumeSpecName: "pod-info") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.655349 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.655628 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.655718 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68b77c70021788ffaf78dfe86ddece9c7d3d5c9cffb40f46dc15f8b79fc094aa/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.661382 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.662997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt" (OuterVolumeSpecName: "kube-api-access-24vlt") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). 
InnerVolumeSpecName "kube-api-access-24vlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.670078 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.671863 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb" (OuterVolumeSpecName: "persistence") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.672046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.674275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cfzv\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-kube-api-access-8cfzv\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.699619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: 
\"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.709066 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf" (OuterVolumeSpecName: "server-conf") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752268 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752307 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752322 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752333 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752345 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752381 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") on node \"crc\" " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.765368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.770136 4795 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.770311 4795 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb") on node "crc" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.791231 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.853558 4795 reconciler_common.go:293] "Volume detached for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.853814 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:32.962240 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.015731 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.016039 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="dnsmasq-dns" containerID="cri-o://64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556" gracePeriod=10 Feb 19 22:45:33 crc kubenswrapper[4795]: E0219 22:45:33.189503 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc119db7_03db_4838_b663_f244b7f93433.slice/crio-conmon-64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556.scope\": RecentStats: unable to find data in memory cache]" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.193428 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.394037 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b","Type":"ContainerStarted","Data":"25a4dd5a66aafe90ee24944e63798abc6d2b3f2338aa1a74cc4b99fcedce95a9"} Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.397379 4795 generic.go:334] "Generic (PLEG): container finished" podID="cc119db7-03db-4838-b663-f244b7f93433" containerID="64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556" exitCode=0 Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.397466 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerDied","Data":"64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556"} Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.400273 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.400311 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerDied","Data":"e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66"} Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.400392 4795 scope.go:117] "RemoveContainer" containerID="5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.438858 4795 scope.go:117] "RemoveContainer" containerID="48d9ac5474c94a11a6da74ec61c602260d05f7961e1a831acf6b59d650ddb95f" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.440950 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.446738 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:33 crc 
kubenswrapper[4795]: I0219 22:45:33.467576 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:33 crc kubenswrapper[4795]: E0219 22:45:33.467856 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.467870 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" Feb 19 22:45:33 crc kubenswrapper[4795]: E0219 22:45:33.467887 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="setup-container" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.467895 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="setup-container" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.468025 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.468786 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.471783 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.471876 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.471931 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rh88g" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.472034 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.472108 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.537130 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd04173-2975-46bd-8602-f6561387d717" path="/var/lib/kubelet/pods/9cd04173-2975-46bd-8602-f6561387d717/volumes" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.538535 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" path="/var/lib/kubelet/pods/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268/volumes" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.539042 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565533 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 
22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce76f8f5-4383-4be1-ab7b-cf862ae77025-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565977 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.566008 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-png8h\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-kube-api-access-png8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.566030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce76f8f5-4383-4be1-ab7b-cf862ae77025-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668292 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-png8h\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-kube-api-access-png8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce76f8f5-4383-4be1-ab7b-cf862ae77025-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce76f8f5-4383-4be1-ab7b-cf862ae77025-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668553 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.669790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.670297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.670482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.671402 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.673132 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.673152 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ce7e4e38eafbc5e4f450d6daa37e1c166de00b3db9bc92a3ff92b374f626b390/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.676885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce76f8f5-4383-4be1-ab7b-cf862ae77025-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.679937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce76f8f5-4383-4be1-ab7b-cf862ae77025-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.683680 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.694586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-png8h\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-kube-api-access-png8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.762222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.840796 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.950209 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4"
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.972975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") pod \"cc119db7-03db-4838-b663-f244b7f93433\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") "
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.973615 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") pod \"cc119db7-03db-4838-b663-f244b7f93433\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") "
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.973763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") pod \"cc119db7-03db-4838-b663-f244b7f93433\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") "
Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.978586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4" (OuterVolumeSpecName: "kube-api-access-784t4") pod "cc119db7-03db-4838-b663-f244b7f93433" (UID: "cc119db7-03db-4838-b663-f244b7f93433"). InnerVolumeSpecName "kube-api-access-784t4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.007013 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc119db7-03db-4838-b663-f244b7f93433" (UID: "cc119db7-03db-4838-b663-f244b7f93433"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.009983 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config" (OuterVolumeSpecName: "config") pod "cc119db7-03db-4838-b663-f244b7f93433" (UID: "cc119db7-03db-4838-b663-f244b7f93433"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.076989 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.077032 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.077046 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") on node \"crc\" DevicePath \"\""
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.282581 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 22:45:34 crc kubenswrapper[4795]: W0219 22:45:34.286205 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce76f8f5_4383_4be1_ab7b_cf862ae77025.slice/crio-da4bb88a1ed67001533a76eba8ca9889a21fe46aab0f553e17159712c917c609 WatchSource:0}: Error finding container da4bb88a1ed67001533a76eba8ca9889a21fe46aab0f553e17159712c917c609: Status 404 returned error can't find the container with id da4bb88a1ed67001533a76eba8ca9889a21fe46aab0f553e17159712c917c609
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.421524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b","Type":"ContainerStarted","Data":"95a2af03cbe9d22b98dd6b3c065e4321753cc8882241a21052ddddc4c34902ad"}
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.425205 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4"
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.425197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerDied","Data":"8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e"}
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.425467 4795 scope.go:117] "RemoveContainer" containerID="64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556"
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.428779 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ce76f8f5-4383-4be1-ab7b-cf862ae77025","Type":"ContainerStarted","Data":"da4bb88a1ed67001533a76eba8ca9889a21fe46aab0f553e17159712c917c609"}
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.450824 4795 scope.go:117] "RemoveContainer" containerID="6c19182bc802a37ab3a253eaa33c548a0f3f15e1ee3c00a8c016db959a5b8557"
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.486291 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"]
Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.494038 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"]
Feb 19 22:45:35 crc kubenswrapper[4795]: I0219 22:45:35.436532 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ce76f8f5-4383-4be1-ab7b-cf862ae77025","Type":"ContainerStarted","Data":"28a8d402d4f21cba268822aeb50ee538ba8e4a32a3d16536eaf6ded74f5bbc84"}
Feb 19 22:45:35 crc kubenswrapper[4795]: I0219 22:45:35.521066 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc119db7-03db-4838-b663-f244b7f93433" path="/var/lib/kubelet/pods/cc119db7-03db-4838-b663-f244b7f93433/volumes"
Feb 19 22:46:07 crc kubenswrapper[4795]: I0219 22:46:07.703919 4795 generic.go:334] "Generic (PLEG): container finished" podID="8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b" containerID="95a2af03cbe9d22b98dd6b3c065e4321753cc8882241a21052ddddc4c34902ad" exitCode=0
Feb 19 22:46:07 crc kubenswrapper[4795]: I0219 22:46:07.704018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b","Type":"ContainerDied","Data":"95a2af03cbe9d22b98dd6b3c065e4321753cc8882241a21052ddddc4c34902ad"}
Feb 19 22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.712320 4795 generic.go:334] "Generic (PLEG): container finished" podID="ce76f8f5-4383-4be1-ab7b-cf862ae77025" containerID="28a8d402d4f21cba268822aeb50ee538ba8e4a32a3d16536eaf6ded74f5bbc84" exitCode=0
Feb 19 22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.712397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ce76f8f5-4383-4be1-ab7b-cf862ae77025","Type":"ContainerDied","Data":"28a8d402d4f21cba268822aeb50ee538ba8e4a32a3d16536eaf6ded74f5bbc84"}
Feb 19 22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.714606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b","Type":"ContainerStarted","Data":"97689d1d9792b12996c2fc3af997e9b6bd43b1de8113bf314329bc0b6ade5aff"}
Feb 19 22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.714826 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 19 22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.766380 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.766361029 podStartE2EDuration="36.766361029s" podCreationTimestamp="2026-02-19 22:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:46:08.762138817 +0000 UTC m=+4679.954656721" watchObservedRunningTime="2026-02-19 22:46:08.766361029 +0000 UTC m=+4679.958878893"
Feb 19 22:46:09 crc kubenswrapper[4795]: I0219 22:46:09.726046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ce76f8f5-4383-4be1-ab7b-cf862ae77025","Type":"ContainerStarted","Data":"2249af915dfcd97ac2ae5ba18ac2a704f0a02d994866f1cbb860ad37ee70a32d"}
Feb 19 22:46:09 crc kubenswrapper[4795]: I0219 22:46:09.727507 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:46:09 crc kubenswrapper[4795]: I0219 22:46:09.753827 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.753805986 podStartE2EDuration="36.753805986s" podCreationTimestamp="2026-02-19 22:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:46:09.753730614 +0000 UTC m=+4680.946248528" watchObservedRunningTime="2026-02-19 22:46:09.753805986 +0000 UTC m=+4680.946323870"
Feb 19 22:46:22 crc kubenswrapper[4795]: I0219 22:46:22.794378 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 19 22:46:23 crc kubenswrapper[4795]: I0219 22:46:23.843642 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.286074 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:46:32 crc kubenswrapper[4795]: E0219 22:46:32.286816 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="init"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.286830 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="init"
Feb 19 22:46:32 crc kubenswrapper[4795]: E0219 22:46:32.286842 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="dnsmasq-dns"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.286848 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="dnsmasq-dns"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.287001 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="dnsmasq-dns"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.287530 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.290786 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-p5fxh"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.297141 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.428676 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") pod \"mariadb-client\" (UID: \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\") " pod="openstack/mariadb-client"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.529520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") pod \"mariadb-client\" (UID: \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\") " pod="openstack/mariadb-client"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.627848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") pod \"mariadb-client\" (UID: \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\") " pod="openstack/mariadb-client"
Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.906262 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 19 22:46:33 crc kubenswrapper[4795]: I0219 22:46:33.479018 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:46:33 crc kubenswrapper[4795]: W0219 22:46:33.485383 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56d10cf2_06d6_4709_a9b9_1b88eb3d6304.slice/crio-372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9 WatchSource:0}: Error finding container 372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9: Status 404 returned error can't find the container with id 372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9
Feb 19 22:46:33 crc kubenswrapper[4795]: I0219 22:46:33.910771 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56d10cf2-06d6-4709-a9b9-1b88eb3d6304","Type":"ContainerStarted","Data":"372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9"}
Feb 19 22:46:34 crc kubenswrapper[4795]: I0219 22:46:34.921446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56d10cf2-06d6-4709-a9b9-1b88eb3d6304","Type":"ContainerStarted","Data":"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686"}
Feb 19 22:46:34 crc kubenswrapper[4795]: I0219 22:46:34.941148 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.495445663 podStartE2EDuration="2.941119957s" podCreationTimestamp="2026-02-19 22:46:32 +0000 UTC" firstStartedPulling="2026-02-19 22:46:33.486789846 +0000 UTC m=+4704.679307710" lastFinishedPulling="2026-02-19 22:46:33.93246415 +0000 UTC m=+4705.124982004" observedRunningTime="2026-02-19 22:46:34.938158142 +0000 UTC m=+4706.130676066" watchObservedRunningTime="2026-02-19 22:46:34.941119957 +0000 UTC m=+4706.133637861"
Feb 19 22:46:50 crc kubenswrapper[4795]: I0219 22:46:50.580224 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:46:50 crc kubenswrapper[4795]: I0219 22:46:50.581017 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerName="mariadb-client" containerID="cri-o://6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686" gracePeriod=30
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.026392 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031686 4795 generic.go:334] "Generic (PLEG): container finished" podID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerID="6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686" exitCode=143
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56d10cf2-06d6-4709-a9b9-1b88eb3d6304","Type":"ContainerDied","Data":"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686"}
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56d10cf2-06d6-4709-a9b9-1b88eb3d6304","Type":"ContainerDied","Data":"372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9"}
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031758 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031778 4795 scope.go:117] "RemoveContainer" containerID="6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686"
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.059394 4795 scope.go:117] "RemoveContainer" containerID="6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686"
Feb 19 22:46:51 crc kubenswrapper[4795]: E0219 22:46:51.059964 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686\": container with ID starting with 6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686 not found: ID does not exist" containerID="6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686"
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.059993 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686"} err="failed to get container status \"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686\": rpc error: code = NotFound desc = could not find container \"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686\": container with ID starting with 6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686 not found: ID does not exist"
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.130990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") pod \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\" (UID: \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\") "
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.136946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5" (OuterVolumeSpecName: "kube-api-access-k2pf5") pod "56d10cf2-06d6-4709-a9b9-1b88eb3d6304" (UID: "56d10cf2-06d6-4709-a9b9-1b88eb3d6304"). InnerVolumeSpecName "kube-api-access-k2pf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.232533 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") on node \"crc\" DevicePath \"\""
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.366950 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.373390 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.521542 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" path="/var/lib/kubelet/pods/56d10cf2-06d6-4709-a9b9-1b88eb3d6304/volumes"
Feb 19 22:46:58 crc kubenswrapper[4795]: I0219 22:46:58.427610 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:46:58 crc kubenswrapper[4795]: I0219 22:46:58.428573 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:47:14 crc kubenswrapper[4795]: I0219 22:47:14.941301 4795 scope.go:117] "RemoveContainer" containerID="fa2e3c7da6ac02ed85489de382e40a9b438bc98edd6481546fefa8394b7e5fce"
Feb 19 22:47:28 crc kubenswrapper[4795]: I0219 22:47:28.427848 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:47:28 crc kubenswrapper[4795]: I0219 22:47:28.428666 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.428113 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.428965 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.429038 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.430061 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.430208 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640" gracePeriod=600
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.611616 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640" exitCode=0
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.611699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640"}
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.612127 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.955855 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rc44b"]
Feb 19 22:47:58 crc kubenswrapper[4795]: E0219 22:47:58.956541 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerName="mariadb-client"
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.956571 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerName="mariadb-client"
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.956848 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerName="mariadb-client"
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.959549 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.971203 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rc44b"]
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.075455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.075513 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnb5b\" (UniqueName: \"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.075576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.176483 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.176773 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnb5b\" (UniqueName: \"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.176842 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.177116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.177261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.194421 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnb5b\" (UniqueName: \"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.277533 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc44b"
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.621056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"}
Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.831949 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rc44b"]
Feb 19 22:48:00 crc kubenswrapper[4795]: I0219 22:48:00.636611 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerID="d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac" exitCode=0
Feb 19 22:48:00 crc kubenswrapper[4795]: I0219 22:48:00.636710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerDied","Data":"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac"}
Feb 19 22:48:00 crc kubenswrapper[4795]: I0219 22:48:00.637156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerStarted","Data":"786407bda10fe867dc026d29a3861c792317f5cb6f61ef1d564701d4907a048b"}
Feb 19 22:48:00 crc kubenswrapper[4795]: I0219 22:48:00.639840 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb
19 22:48:01 crc kubenswrapper[4795]: I0219 22:48:01.652351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerStarted","Data":"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18"} Feb 19 22:48:02 crc kubenswrapper[4795]: I0219 22:48:02.662401 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerID="4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18" exitCode=0 Feb 19 22:48:02 crc kubenswrapper[4795]: I0219 22:48:02.662512 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerDied","Data":"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18"} Feb 19 22:48:03 crc kubenswrapper[4795]: I0219 22:48:03.672291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerStarted","Data":"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb"} Feb 19 22:48:03 crc kubenswrapper[4795]: I0219 22:48:03.689729 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rc44b" podStartSLOduration=3.247860128 podStartE2EDuration="5.689704831s" podCreationTimestamp="2026-02-19 22:47:58 +0000 UTC" firstStartedPulling="2026-02-19 22:48:00.639461789 +0000 UTC m=+4791.831979693" lastFinishedPulling="2026-02-19 22:48:03.081306522 +0000 UTC m=+4794.273824396" observedRunningTime="2026-02-19 22:48:03.689426083 +0000 UTC m=+4794.881943957" watchObservedRunningTime="2026-02-19 22:48:03.689704831 +0000 UTC m=+4794.882222705" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.278324 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.278759 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.320867 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.785420 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.829579 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rc44b"] Feb 19 22:48:11 crc kubenswrapper[4795]: I0219 22:48:11.760344 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rc44b" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="registry-server" containerID="cri-o://d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" gracePeriod=2 Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.141782 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.285593 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") pod \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.285735 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") pod \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.285827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnb5b\" (UniqueName: \"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") pod \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.286507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities" (OuterVolumeSpecName: "utilities") pod "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" (UID: "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.290522 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b" (OuterVolumeSpecName: "kube-api-access-mnb5b") pod "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" (UID: "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53"). InnerVolumeSpecName "kube-api-access-mnb5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.357820 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" (UID: "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.387699 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.387749 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnb5b\" (UniqueName: \"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") on node \"crc\" DevicePath \"\"" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.387761 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774161 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerID="d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" exitCode=0 Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774269 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerDied","Data":"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb"} Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774293 4795 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerDied","Data":"786407bda10fe867dc026d29a3861c792317f5cb6f61ef1d564701d4907a048b"} Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774681 4795 scope.go:117] "RemoveContainer" containerID="d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.819913 4795 scope.go:117] "RemoveContainer" containerID="4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.822149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rc44b"] Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.835103 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rc44b"] Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.839966 4795 scope.go:117] "RemoveContainer" containerID="d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.882081 4795 scope.go:117] "RemoveContainer" containerID="d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" Feb 19 22:48:12 crc kubenswrapper[4795]: E0219 22:48:12.882964 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb\": container with ID starting with d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb not found: ID does not exist" containerID="d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.883020 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb"} err="failed to get container status \"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb\": rpc error: code = NotFound desc = could not find container \"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb\": container with ID starting with d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb not found: ID does not exist" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.883051 4795 scope.go:117] "RemoveContainer" containerID="4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18" Feb 19 22:48:12 crc kubenswrapper[4795]: E0219 22:48:12.883584 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18\": container with ID starting with 4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18 not found: ID does not exist" containerID="4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.883675 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18"} err="failed to get container status \"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18\": rpc error: code = NotFound desc = could not find container \"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18\": container with ID starting with 4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18 not found: ID does not exist" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.883699 4795 scope.go:117] "RemoveContainer" containerID="d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac" Feb 19 22:48:12 crc kubenswrapper[4795]: E0219 
22:48:12.884056 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac\": container with ID starting with d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac not found: ID does not exist" containerID="d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.884087 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac"} err="failed to get container status \"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac\": rpc error: code = NotFound desc = could not find container \"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac\": container with ID starting with d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac not found: ID does not exist" Feb 19 22:48:13 crc kubenswrapper[4795]: I0219 22:48:13.531564 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" path="/var/lib/kubelet/pods/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53/volumes" Feb 19 22:49:58 crc kubenswrapper[4795]: I0219 22:49:58.427817 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:49:58 crc kubenswrapper[4795]: I0219 22:49:58.428267 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.285321 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2l99m"] Feb 19 22:50:14 crc kubenswrapper[4795]: E0219 22:50:14.286365 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="extract-utilities" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.286485 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="extract-utilities" Feb 19 22:50:14 crc kubenswrapper[4795]: E0219 22:50:14.286525 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="registry-server" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.286533 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="registry-server" Feb 19 22:50:14 crc kubenswrapper[4795]: E0219 22:50:14.286553 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="extract-content" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.286561 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="extract-content" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.286750 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="registry-server" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.287848 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.312704 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l99m"] Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.397428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.397508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.397719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.499417 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.499474 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.499569 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.500082 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.500086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.520018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.608493 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:15 crc kubenswrapper[4795]: I0219 22:50:15.044318 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l99m"] Feb 19 22:50:15 crc kubenswrapper[4795]: I0219 22:50:15.801884 4795 generic.go:334] "Generic (PLEG): container finished" podID="5964e2d1-6384-4043-9857-a20ea29bd451" containerID="683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea" exitCode=0 Feb 19 22:50:15 crc kubenswrapper[4795]: I0219 22:50:15.801954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerDied","Data":"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea"} Feb 19 22:50:15 crc kubenswrapper[4795]: I0219 22:50:15.802541 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerStarted","Data":"0442e2a13005e0db40a7fe7c99138b63b899c77f70a2bdc495ae396fdb4dc13b"} Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.682126 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"] Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.687923 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.692221 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"] Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.751458 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.751531 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.751754 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.853642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.853742 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.853774 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.854144 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.854290 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.878508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.008863 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fl94b"
Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.253539 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"]
Feb 19 22:50:17 crc kubenswrapper[4795]: W0219 22:50:17.298346 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e78bcb8_816d_4f80_9ec1_ef03e589b2b5.slice/crio-94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea WatchSource:0}: Error finding container 94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea: Status 404 returned error can't find the container with id 94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea
Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.815238 4795 generic.go:334] "Generic (PLEG): container finished" podID="5964e2d1-6384-4043-9857-a20ea29bd451" containerID="9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135" exitCode=0
Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.815318 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerDied","Data":"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135"}
Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.817462 4795 generic.go:334] "Generic (PLEG): container finished" podID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerID="1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c" exitCode=0
Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.817511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerDied","Data":"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c"}
Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.817538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerStarted","Data":"94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea"}
Feb 19 22:50:18 crc kubenswrapper[4795]: I0219 22:50:18.825838 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerStarted","Data":"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda"}
Feb 19 22:50:18 crc kubenswrapper[4795]: I0219 22:50:18.828282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerStarted","Data":"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff"}
Feb 19 22:50:18 crc kubenswrapper[4795]: I0219 22:50:18.860541 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2l99m" podStartSLOduration=2.353502198 podStartE2EDuration="4.860521979s" podCreationTimestamp="2026-02-19 22:50:14 +0000 UTC" firstStartedPulling="2026-02-19 22:50:15.803983976 +0000 UTC m=+4926.996501840" lastFinishedPulling="2026-02-19 22:50:18.311003757 +0000 UTC m=+4929.503521621" observedRunningTime="2026-02-19 22:50:18.8584798 +0000 UTC m=+4930.050997684" watchObservedRunningTime="2026-02-19 22:50:18.860521979 +0000 UTC m=+4930.053039833"
Feb 19 22:50:19 crc kubenswrapper[4795]: I0219 22:50:19.876261 4795 generic.go:334] "Generic (PLEG): container finished" podID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerID="ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda" exitCode=0
Feb 19 22:50:19 crc kubenswrapper[4795]: I0219 22:50:19.876335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerDied","Data":"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda"}
Feb 19 22:50:20 crc kubenswrapper[4795]: I0219 22:50:20.886783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerStarted","Data":"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793"}
Feb 19 22:50:20 crc kubenswrapper[4795]: I0219 22:50:20.903758 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fl94b" podStartSLOduration=2.474535875 podStartE2EDuration="4.903737815s" podCreationTimestamp="2026-02-19 22:50:16 +0000 UTC" firstStartedPulling="2026-02-19 22:50:17.819082267 +0000 UTC m=+4929.011600141" lastFinishedPulling="2026-02-19 22:50:20.248284217 +0000 UTC m=+4931.440802081" observedRunningTime="2026-02-19 22:50:20.90112873 +0000 UTC m=+4932.093646624" watchObservedRunningTime="2026-02-19 22:50:20.903737815 +0000 UTC m=+4932.096255689"
Feb 19 22:50:24 crc kubenswrapper[4795]: I0219 22:50:24.609548 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2l99m"
Feb 19 22:50:24 crc kubenswrapper[4795]: I0219 22:50:24.610274 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2l99m"
Feb 19 22:50:24 crc kubenswrapper[4795]: I0219 22:50:24.648303 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2l99m"
Feb 19 22:50:24 crc kubenswrapper[4795]: I0219 22:50:24.959869 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2l99m"
Feb 19 22:50:26 crc kubenswrapper[4795]: I0219 22:50:26.075138 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2l99m"]
Feb 19 22:50:26 crc kubenswrapper[4795]: I0219 22:50:26.924589 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2l99m" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="registry-server" containerID="cri-o://1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff" gracePeriod=2
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.009327 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fl94b"
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.009384 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fl94b"
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.052066 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fl94b"
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.363080 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l99m"
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.412027 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") pod \"5964e2d1-6384-4043-9857-a20ea29bd451\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") "
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.412198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") pod \"5964e2d1-6384-4043-9857-a20ea29bd451\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") "
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.412306 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") pod \"5964e2d1-6384-4043-9857-a20ea29bd451\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") "
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.413408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities" (OuterVolumeSpecName: "utilities") pod "5964e2d1-6384-4043-9857-a20ea29bd451" (UID: "5964e2d1-6384-4043-9857-a20ea29bd451"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.419689 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm" (OuterVolumeSpecName: "kube-api-access-v8nnm") pod "5964e2d1-6384-4043-9857-a20ea29bd451" (UID: "5964e2d1-6384-4043-9857-a20ea29bd451"). InnerVolumeSpecName "kube-api-access-v8nnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.459098 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5964e2d1-6384-4043-9857-a20ea29bd451" (UID: "5964e2d1-6384-4043-9857-a20ea29bd451"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.513562 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.513587 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") on node \"crc\" DevicePath \"\""
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.513600 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.936703 4795 generic.go:334] "Generic (PLEG): container finished" podID="5964e2d1-6384-4043-9857-a20ea29bd451" containerID="1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff" exitCode=0
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.936813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerDied","Data":"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff"}
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.936885 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerDied","Data":"0442e2a13005e0db40a7fe7c99138b63b899c77f70a2bdc495ae396fdb4dc13b"}
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.936921 4795 scope.go:117] "RemoveContainer" containerID="1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff"
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.937685 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l99m"
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.966939 4795 scope.go:117] "RemoveContainer" containerID="9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135"
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.971052 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2l99m"]
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.980331 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2l99m"]
Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.987433 4795 scope.go:117] "RemoveContainer" containerID="683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea"
Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.015045 4795 scope.go:117] "RemoveContainer" containerID="1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff"
Feb 19 22:50:28 crc kubenswrapper[4795]: E0219 22:50:28.015539 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff\": container with ID starting with 1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff not found: ID does not exist" containerID="1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff"
Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.015581 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff"} err="failed to get container status \"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff\": rpc error: code = NotFound desc = could not find container \"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff\": container with ID starting with 1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff not found: ID does not exist"
Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.015607 4795 scope.go:117] "RemoveContainer" containerID="9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135"
Feb 19 22:50:28 crc kubenswrapper[4795]: E0219 22:50:28.016056 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135\": container with ID starting with 9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135 not found: ID does not exist" containerID="9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135"
Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.016084 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135"} err="failed to get container status \"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135\": rpc error: code = NotFound desc = could not find container \"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135\": container with ID starting with 9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135 not found: ID does not exist"
Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.016102 4795 scope.go:117] "RemoveContainer" containerID="683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea"
Feb 19 22:50:28 crc kubenswrapper[4795]: E0219 22:50:28.016582 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea\": container with ID starting with 683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea not found: ID does not exist" containerID="683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea"
Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.016616 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea"} err="failed to get container status \"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea\": rpc error: code = NotFound desc = could not find container \"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea\": container with ID starting with 683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea not found: ID does not exist"
Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.025927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fl94b"
Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.427646 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.428513 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:50:29 crc kubenswrapper[4795]: I0219 22:50:29.473459 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"]
Feb 19 22:50:29 crc kubenswrapper[4795]: I0219 22:50:29.522768 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" path="/var/lib/kubelet/pods/5964e2d1-6384-4043-9857-a20ea29bd451/volumes"
Feb 19 22:50:29 crc kubenswrapper[4795]: I0219 22:50:29.951718 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fl94b" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="registry-server" containerID="cri-o://a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793" gracePeriod=2
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.341731 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fl94b"
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.355607 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") pod \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") "
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.355685 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") pod \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") "
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.355732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") pod \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") "
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.357959 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities" (OuterVolumeSpecName: "utilities") pod "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" (UID: "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.365496 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm" (OuterVolumeSpecName: "kube-api-access-jrzlm") pod "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" (UID: "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5"). InnerVolumeSpecName "kube-api-access-jrzlm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.466916 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") on node \"crc\" DevicePath \"\""
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.466959 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.492208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" (UID: "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.569712 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.970985 4795 generic.go:334] "Generic (PLEG): container finished" podID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerID="a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793" exitCode=0
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.971027 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerDied","Data":"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793"}
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.971050 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerDied","Data":"94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea"}
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.971056 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fl94b"
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.971065 4795 scope.go:117] "RemoveContainer" containerID="a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793"
Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.992600 4795 scope.go:117] "RemoveContainer" containerID="ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda"
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.023256 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"]
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.030593 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"]
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.031906 4795 scope.go:117] "RemoveContainer" containerID="1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c"
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.051835 4795 scope.go:117] "RemoveContainer" containerID="a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793"
Feb 19 22:50:31 crc kubenswrapper[4795]: E0219 22:50:31.052405 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793\": container with ID starting with a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793 not found: ID does not exist" containerID="a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793"
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.052447 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793"} err="failed to get container status \"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793\": rpc error: code = NotFound desc = could not find container \"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793\": container with ID starting with a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793 not found: ID does not exist"
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.052472 4795 scope.go:117] "RemoveContainer" containerID="ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda"
Feb 19 22:50:31 crc kubenswrapper[4795]: E0219 22:50:31.053138 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda\": container with ID starting with ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda not found: ID does not exist" containerID="ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda"
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.053277 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda"} err="failed to get container status \"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda\": rpc error: code = NotFound desc = could not find container \"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda\": container with ID starting with ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda not found: ID does not exist"
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.053336 4795 scope.go:117] "RemoveContainer" containerID="1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c"
Feb 19 22:50:31 crc kubenswrapper[4795]: E0219 22:50:31.053952 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c\": container with ID starting with 1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c not found: ID does not exist" containerID="1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c"
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.053985 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c"} err="failed to get container status \"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c\": rpc error: code = NotFound desc = could not find container \"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c\": container with ID starting with 1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c not found: ID does not exist"
Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.523082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" path="/var/lib/kubelet/pods/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5/volumes"
Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.427476 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.427999 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.428067 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.428775 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.428841 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" gracePeriod=600
Feb 19 22:50:58 crc kubenswrapper[4795]: E0219 22:50:58.555654 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:50:59 crc kubenswrapper[4795]: I0219 22:50:59.180859 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" exitCode=0
Feb 19 22:50:59 crc kubenswrapper[4795]: I0219 22:50:59.180901 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"}
Feb 19 22:50:59 crc kubenswrapper[4795]: I0219 22:50:59.181001 4795 scope.go:117] "RemoveContainer" containerID="85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640"
Feb 19 22:50:59 crc kubenswrapper[4795]: I0219 22:50:59.181488 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"
Feb 19 22:50:59 crc kubenswrapper[4795]: E0219 22:50:59.181719 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:51:10 crc kubenswrapper[4795]: I0219 22:51:10.511802 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"
Feb 19 22:51:10 crc kubenswrapper[4795]: E0219 22:51:10.512623 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.784507 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"]
Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785229 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="registry-server"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785246 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="registry-server"
Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785260 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="registry-server"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785268 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="registry-server"
Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785296 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="extract-content"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="extract-content"
Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785318 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="extract-utilities"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785326 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="extract-utilities"
Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785342 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="extract-utilities"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785353 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="extract-utilities"
Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785374 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="extract-content"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785384 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="extract-content"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785566 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="registry-server"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785592 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="registry-server"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.786303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.788485 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-p5fxh"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.803483 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.886612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.886724 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.987904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.988078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data"
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.990635 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.990681 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d0db1225f95d454647216c5717445acc04f6111435881f684855bea7543e0b64/globalmount\"" pod="openstack/mariadb-copy-data"
Feb 19 22:51:15 crc kubenswrapper[4795]: I0219 22:51:15.011592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data"
Feb 19 22:51:15 crc kubenswrapper[4795]: I0219 22:51:15.021257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data"
Feb 19 22:51:15 crc kubenswrapper[4795]: I0219 22:51:15.071043 4795 scope.go:117] "RemoveContainer" containerID="9acf4103dbab2da287716a026f87427a75c0734ef6734fb45eb23664d9f962a3"
Feb 19 22:51:15 crc kubenswrapper[4795]: I0219 22:51:15.114419 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Feb 19 22:51:15 crc kubenswrapper[4795]: I0219 22:51:15.601137 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Feb 19 22:51:16 crc kubenswrapper[4795]: I0219 22:51:16.307297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"4f232979-ab9c-4b59-8ad8-7756367fe0bf","Type":"ContainerStarted","Data":"d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224"}
Feb 19 22:51:16 crc kubenswrapper[4795]: I0219 22:51:16.307695 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"4f232979-ab9c-4b59-8ad8-7756367fe0bf","Type":"ContainerStarted","Data":"6654d86759c1aab8319afbb64f442570ab81fe83e940c47c9e22d533fa8c1665"}
Feb 19 22:51:16 crc kubenswrapper[4795]: I0219 22:51:16.331337 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.331315563 podStartE2EDuration="3.331315563s" podCreationTimestamp="2026-02-19 22:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:16.328095761 +0000 UTC m=+4987.520613625" watchObservedRunningTime="2026-02-19 22:51:16.331315563 +0000 UTC m=+4987.523833427"
Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.441522 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.443146 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.451779 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.574992 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") pod \"mariadb-client\" (UID: \"cc605ab7-0f74-4d42-881d-c486eee6bd72\") " pod="openstack/mariadb-client" Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.676267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") pod \"mariadb-client\" (UID: \"cc605ab7-0f74-4d42-881d-c486eee6bd72\") " pod="openstack/mariadb-client" Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.697853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") pod \"mariadb-client\" (UID: \"cc605ab7-0f74-4d42-881d-c486eee6bd72\") " pod="openstack/mariadb-client" Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.767429 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:20 crc kubenswrapper[4795]: I0219 22:51:19.999922 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:20 crc kubenswrapper[4795]: I0219 22:51:20.410574 4795 generic.go:334] "Generic (PLEG): container finished" podID="cc605ab7-0f74-4d42-881d-c486eee6bd72" containerID="f513386e45a86e8c71f0247f1dc15d3e271452b3797cb4850f3ce460e73e544e" exitCode=0 Feb 19 22:51:20 crc kubenswrapper[4795]: I0219 22:51:20.410653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"cc605ab7-0f74-4d42-881d-c486eee6bd72","Type":"ContainerDied","Data":"f513386e45a86e8c71f0247f1dc15d3e271452b3797cb4850f3ce460e73e544e"} Feb 19 22:51:20 crc kubenswrapper[4795]: I0219 22:51:20.410677 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"cc605ab7-0f74-4d42-881d-c486eee6bd72","Type":"ContainerStarted","Data":"a90182db1e02dd2460da3caae1b4690b2f8e06fbecf0eed6750807d1237bf802"} Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.708111 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.745826 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_cc605ab7-0f74-4d42-881d-c486eee6bd72/mariadb-client/0.log" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.771634 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.776581 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.809800 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") pod \"cc605ab7-0f74-4d42-881d-c486eee6bd72\" (UID: \"cc605ab7-0f74-4d42-881d-c486eee6bd72\") " Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.815504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf" (OuterVolumeSpecName: "kube-api-access-rz7jf") pod "cc605ab7-0f74-4d42-881d-c486eee6bd72" (UID: "cc605ab7-0f74-4d42-881d-c486eee6bd72"). InnerVolumeSpecName "kube-api-access-rz7jf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.911907 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") on node \"crc\" DevicePath \"\"" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.930656 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:21 crc kubenswrapper[4795]: E0219 22:51:21.930948 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" containerName="mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.930965 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" containerName="mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.931126 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" containerName="mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.932798 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.942679 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.013565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") pod \"mariadb-client\" (UID: \"8d6f031b-713a-4c22-8017-5a615e34004f\") " pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.116120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") pod \"mariadb-client\" (UID: \"8d6f031b-713a-4c22-8017-5a615e34004f\") " pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.134307 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") pod \"mariadb-client\" (UID: \"8d6f031b-713a-4c22-8017-5a615e34004f\") " pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.253905 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.432707 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90182db1e02dd2460da3caae1b4690b2f8e06fbecf0eed6750807d1237bf802" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.432909 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.452095 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.497944 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:22 crc kubenswrapper[4795]: W0219 22:51:22.498604 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d6f031b_713a_4c22_8017_5a615e34004f.slice/crio-7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4 WatchSource:0}: Error finding container 7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4: Status 404 returned error can't find the container with id 7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4 Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.511425 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:51:22 crc kubenswrapper[4795]: E0219 22:51:22.511683 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:51:23 crc kubenswrapper[4795]: I0219 22:51:23.440092 4795 generic.go:334] "Generic (PLEG): container finished" podID="8d6f031b-713a-4c22-8017-5a615e34004f" containerID="ddf33d9929aa340d91607be45827bb09c61b346f1c01e148af46a4a7d07c2f45" exitCode=0 Feb 19 22:51:23 crc kubenswrapper[4795]: I0219 
22:51:23.440186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8d6f031b-713a-4c22-8017-5a615e34004f","Type":"ContainerDied","Data":"ddf33d9929aa340d91607be45827bb09c61b346f1c01e148af46a4a7d07c2f45"} Feb 19 22:51:23 crc kubenswrapper[4795]: I0219 22:51:23.440393 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8d6f031b-713a-4c22-8017-5a615e34004f","Type":"ContainerStarted","Data":"7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4"} Feb 19 22:51:23 crc kubenswrapper[4795]: I0219 22:51:23.525687 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" path="/var/lib/kubelet/pods/cc605ab7-0f74-4d42-881d-c486eee6bd72/volumes" Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.737029 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.758556 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_8d6f031b-713a-4c22-8017-5a615e34004f/mariadb-client/0.log" Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.786821 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.791106 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.855872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") pod \"8d6f031b-713a-4c22-8017-5a615e34004f\" (UID: \"8d6f031b-713a-4c22-8017-5a615e34004f\") " Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.862347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8" (OuterVolumeSpecName: "kube-api-access-l8zf8") pod "8d6f031b-713a-4c22-8017-5a615e34004f" (UID: "8d6f031b-713a-4c22-8017-5a615e34004f"). InnerVolumeSpecName "kube-api-access-l8zf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.957668 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") on node \"crc\" DevicePath \"\"" Feb 19 22:51:25 crc kubenswrapper[4795]: I0219 22:51:25.455259 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4" Feb 19 22:51:25 crc kubenswrapper[4795]: I0219 22:51:25.455708 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:25 crc kubenswrapper[4795]: I0219 22:51:25.520186 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" path="/var/lib/kubelet/pods/8d6f031b-713a-4c22-8017-5a615e34004f/volumes" Feb 19 22:51:34 crc kubenswrapper[4795]: I0219 22:51:34.511430 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:51:34 crc kubenswrapper[4795]: E0219 22:51:34.512226 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:51:47 crc kubenswrapper[4795]: I0219 22:51:47.512054 4795 
scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:51:47 crc kubenswrapper[4795]: E0219 22:51:47.512872 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.553588 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 22:51:54 crc kubenswrapper[4795]: E0219 22:51:54.554468 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" containerName="mariadb-client" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.554485 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" containerName="mariadb-client" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.554674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" containerName="mariadb-client" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.555543 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.558497 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.559232 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.559475 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2mgdw" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.586975 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.607841 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.609595 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.612649 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.614990 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.620731 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.621094 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/924d2a8a-2ae7-417a-9770-054662474286-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.621136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924d2a8a-2ae7-417a-9770-054662474286-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.621200 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-config\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.621413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlm54\" (UniqueName: \"kubernetes.io/projected/924d2a8a-2ae7-417a-9770-054662474286-kube-api-access-qlm54\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc 
kubenswrapper[4795]: I0219 22:51:54.621486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.630055 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.637966 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snclf\" (UniqueName: \"kubernetes.io/projected/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-kube-api-access-snclf\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723450 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfq9\" (UniqueName: \"kubernetes.io/projected/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-kube-api-access-tsfq9\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723766 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/924d2a8a-2ae7-417a-9770-054662474286-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924d2a8a-2ae7-417a-9770-054662474286-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-config\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-config\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlm54\" (UniqueName: \"kubernetes.io/projected/924d2a8a-2ae7-417a-9770-054662474286-kube-api-access-qlm54\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724158 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-841e812f-b743-45d7-a4f7-61e300ae1598\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-841e812f-b743-45d7-a4f7-61e300ae1598\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724254 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-config\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/924d2a8a-2ae7-417a-9770-054662474286-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.725128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-config\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.725588 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.731006 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.731270 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3b528f5553046ea28336dd5dedcb616848a6b59d36a3821c411aaad7c89c5a53/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.731565 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924d2a8a-2ae7-417a-9770-054662474286-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.741592 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.742979 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.749992 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-znw7z"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.750229 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.750349 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.757931 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.765126 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlm54\" (UniqueName: \"kubernetes.io/projected/924d2a8a-2ae7-417a-9770-054662474286-kube-api-access-qlm54\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.765974 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"]
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.767260 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.772295 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"]
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.774616 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.781422 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.793116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.810556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826204 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826299 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/188a11e4-50de-4672-baaf-89a3a512cd0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826350 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7lh6\" (UniqueName: \"kubernetes.io/projected/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-kube-api-access-z7lh6\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpf5\" (UniqueName: \"kubernetes.io/projected/f814768e-2961-4d2a-ba3b-615dea717cf8-kube-api-access-9tpf5\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826398 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-841e812f-b743-45d7-a4f7-61e300ae1598\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-841e812f-b743-45d7-a4f7-61e300ae1598\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1bd16f90-4266-4719-9291-209b6cf6174c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bd16f90-4266-4719-9291-209b6cf6174c\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826434 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826494 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-config\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188a11e4-50de-4672-baaf-89a3a512cd0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826580 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-08539c3c-3934-44e9-9aad-66b43491c570\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08539c3c-3934-44e9-9aad-66b43491c570\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826813 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snclf\" (UniqueName: \"kubernetes.io/projected/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-kube-api-access-snclf\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826877 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-config\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826910 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfq9\" (UniqueName: \"kubernetes.io/projected/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-kube-api-access-tsfq9\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826970 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vpjf\" (UniqueName: \"kubernetes.io/projected/188a11e4-50de-4672-baaf-89a3a512cd0c-kube-api-access-7vpjf\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827069 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f814768e-2961-4d2a-ba3b-615dea717cf8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827107 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f814768e-2961-4d2a-ba3b-615dea717cf8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-config\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827151 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-config\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827182 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-config\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827655 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.828024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.828188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.828544 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-config\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.829883 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.829910 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-841e812f-b743-45d7-a4f7-61e300ae1598\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-841e812f-b743-45d7-a4f7-61e300ae1598\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aca27e3cff8edd54249a833e181f411a80bf436270cd04eb0a8f9162f59ecfd5/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.830343 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.830368 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0a2d8661200c1d07fe2ccaa0ae4e4d6b97a3f8f36bbb172d4a279baae4787109/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.831219 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.833954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.848397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfq9\" (UniqueName: \"kubernetes.io/projected/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-kube-api-access-tsfq9\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.848613 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snclf\" (UniqueName: \"kubernetes.io/projected/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-kube-api-access-snclf\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.876551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-841e812f-b743-45d7-a4f7-61e300ae1598\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-841e812f-b743-45d7-a4f7-61e300ae1598\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.878110 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.880398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929350 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929436 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188a11e4-50de-4672-baaf-89a3a512cd0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-08539c3c-3934-44e9-9aad-66b43491c570\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08539c3c-3934-44e9-9aad-66b43491c570\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-config\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vpjf\" (UniqueName: \"kubernetes.io/projected/188a11e4-50de-4672-baaf-89a3a512cd0c-kube-api-access-7vpjf\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929691 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f814768e-2961-4d2a-ba3b-615dea717cf8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f814768e-2961-4d2a-ba3b-615dea717cf8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-config\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/188a11e4-50de-4672-baaf-89a3a512cd0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7lh6\" (UniqueName: \"kubernetes.io/projected/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-kube-api-access-z7lh6\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpf5\" (UniqueName: \"kubernetes.io/projected/f814768e-2961-4d2a-ba3b-615dea717cf8-kube-api-access-9tpf5\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929883 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1bd16f90-4266-4719-9291-209b6cf6174c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bd16f90-4266-4719-9291-209b6cf6174c\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.931430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/188a11e4-50de-4672-baaf-89a3a512cd0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.931830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.932140 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-config\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934351 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934427 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-08539c3c-3934-44e9-9aad-66b43491c570\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08539c3c-3934-44e9-9aad-66b43491c570\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c5557edd1aaf2a0c264069831ad9260b3ced248ed61a9854588c21aa3523c18c/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935461 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934748 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188a11e4-50de-4672-baaf-89a3a512cd0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934530 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935247 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935544 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1bd16f90-4266-4719-9291-209b6cf6174c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bd16f90-4266-4719-9291-209b6cf6174c\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/27a567fbfad556db7b211a2e7847fe547895c6e57e901962a8919ac461884ee0/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-config\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935152 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935448 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f814768e-2961-4d2a-ba3b-615dea717cf8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935620 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db1dd82f0e1b9f62d7344caa6e7654af05c11eaf0ccaf04f246ecd80c7b35186/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.939750 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.940641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f814768e-2961-4d2a-ba3b-615dea717cf8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.948434 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpf5\" (UniqueName: \"kubernetes.io/projected/f814768e-2961-4d2a-ba3b-615dea717cf8-kube-api-access-9tpf5\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.950759 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.954264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.955531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7lh6\" (UniqueName: \"kubernetes.io/projected/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-kube-api-access-z7lh6\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.964517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.967580 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vpjf\" (UniqueName: \"kubernetes.io/projected/188a11e4-50de-4672-baaf-89a3a512cd0c-kube-api-access-7vpjf\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.976254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-08539c3c-3934-44e9-9aad-66b43491c570\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08539c3c-3934-44e9-9aad-66b43491c570\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.985770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1bd16f90-4266-4719-9291-209b6cf6174c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bd16f90-4266-4719-9291-209b6cf6174c\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.200675 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.240085 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.253907 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.415725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 19 22:51:55 crc kubenswrapper[4795]: W0219 22:51:55.519633 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c8cf9f5_7499_4c52_9710_91b96d49b0fc.slice/crio-1ac92e99539ac69cecf201912fafb10d05998d6510947a4be6459d01671fc203 WatchSource:0}: Error finding container 1ac92e99539ac69cecf201912fafb10d05998d6510947a4be6459d01671fc203: Status 404 returned error can't find the container with id 1ac92e99539ac69cecf201912fafb10d05998d6510947a4be6459d01671fc203
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.525510 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.674652 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"924d2a8a-2ae7-417a-9770-054662474286","Type":"ContainerStarted","Data":"49c0a55fa410598d38997cfac435ed117fc4d698fba1d797eddfd1fe9b55a871"}
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.674690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"924d2a8a-2ae7-417a-9770-054662474286","Type":"ContainerStarted","Data":"32337d3621741d173ad77da972c37a7e044d3759fd535ec343e10bfb072597aa"}
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.676893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9c8cf9f5-7499-4c52-9710-91b96d49b0fc","Type":"ContainerStarted","Data":"652d5844931377f852635eef72cbbec67148a46211ccc470935614b10dad6194"}
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.676916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9c8cf9f5-7499-4c52-9710-91b96d49b0fc","Type":"ContainerStarted","Data":"1ac92e99539ac69cecf201912fafb10d05998d6510947a4be6459d01671fc203"}
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.703626 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 19 22:51:55 crc kubenswrapper[4795]: W0219 22:51:55.718069 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b32c19b_2b8b_4587_9327_1ddf5b074ad6.slice/crio-e7606ae6f09fbf2b028db75b20c79529c331868bb30de53bddd69aca62425a7a WatchSource:0}: Error finding container e7606ae6f09fbf2b028db75b20c79529c331868bb30de53bddd69aca62425a7a: Status 404 returned error can't find the container with id e7606ae6f09fbf2b028db75b20c79529c331868bb30de53bddd69aca62425a7a
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.817152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Feb 19 22:51:55 crc kubenswrapper[4795]: W0219 22:51:55.821655 4795 manager.go:1169] Failed
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf814768e_2961_4d2a_ba3b_615dea717cf8.slice/crio-6c53eb8c45bad827e2360bebeef78494ad130dc2748fdc0944c22d4d4a99ae9a WatchSource:0}: Error finding container 6c53eb8c45bad827e2360bebeef78494ad130dc2748fdc0944c22d4d4a99ae9a: Status 404 returned error can't find the container with id 6c53eb8c45bad827e2360bebeef78494ad130dc2748fdc0944c22d4d4a99ae9a Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.114624 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.683989 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f814768e-2961-4d2a-ba3b-615dea717cf8","Type":"ContainerStarted","Data":"7155fc037f2c62975ea85f65ed5cdd4da97d391e6238a7588c35c446cd943e68"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.685239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f814768e-2961-4d2a-ba3b-615dea717cf8","Type":"ContainerStarted","Data":"c3294df24bbdd5efeecb8ed96f4b3ec477f55238605ce6a7ba26f396bcf9316b"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.685266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f814768e-2961-4d2a-ba3b-615dea717cf8","Type":"ContainerStarted","Data":"6c53eb8c45bad827e2360bebeef78494ad130dc2748fdc0944c22d4d4a99ae9a"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.688671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9b32c19b-2b8b-4587-9327-1ddf5b074ad6","Type":"ContainerStarted","Data":"faae9ef344ebf349c5656c03b60dc6ec8b3197801d99b54016f0ab282a2c1d2b"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.688697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"9b32c19b-2b8b-4587-9327-1ddf5b074ad6","Type":"ContainerStarted","Data":"0bcc6a4699eaa7b1e559d8fb7153887a76810c14f58965864516267036824580"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.688706 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9b32c19b-2b8b-4587-9327-1ddf5b074ad6","Type":"ContainerStarted","Data":"e7606ae6f09fbf2b028db75b20c79529c331868bb30de53bddd69aca62425a7a"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.690083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9c8cf9f5-7499-4c52-9710-91b96d49b0fc","Type":"ContainerStarted","Data":"4130d4c8f1430c2f2c5659f4afff52f3b06db37d844f72cd7d8ec089d6bd8c10"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.692009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"9c5bddd7-705d-41b3-ad43-1889c6c34ab0","Type":"ContainerStarted","Data":"b3658c2e1080f8ddd0daa19ba508bd8f8781a70157518d68b05c21f0e2ecafc5"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.692032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"9c5bddd7-705d-41b3-ad43-1889c6c34ab0","Type":"ContainerStarted","Data":"74bba6c69905c7ad27b9f920647247cdab0473ebb5cb41441b882379756a3caf"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.692042 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"9c5bddd7-705d-41b3-ad43-1889c6c34ab0","Type":"ContainerStarted","Data":"776f674a5b7e650c62f1c4ec9ad34086b4a66fb8c7326722c072767e6b74bcf1"} Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.693250 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"924d2a8a-2ae7-417a-9770-054662474286","Type":"ContainerStarted","Data":"26bd76b1fc96d5f7750289149cb37d6cec20052065f5959d04c801e28f6bb382"} Feb 19 22:51:56 crc 
kubenswrapper[4795]: I0219 22:51:56.732669 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.732654952 podStartE2EDuration="3.732654952s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.731804818 +0000 UTC m=+5027.924322682" watchObservedRunningTime="2026-02-19 22:51:56.732654952 +0000 UTC m=+5027.925172816" Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.735867 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.735859104 podStartE2EDuration="3.735859104s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.708932102 +0000 UTC m=+5027.901449976" watchObservedRunningTime="2026-02-19 22:51:56.735859104 +0000 UTC m=+5027.928376968" Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.761030 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.7610094849999998 podStartE2EDuration="3.761009485s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.756807194 +0000 UTC m=+5027.949325068" watchObservedRunningTime="2026-02-19 22:51:56.761009485 +0000 UTC m=+5027.953527349" Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.789510 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.789486051 podStartE2EDuration="3.789486051s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.787723191 +0000 UTC m=+5027.980241055" watchObservedRunningTime="2026-02-19 22:51:56.789486051 +0000 UTC m=+5027.982003945" Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.815220 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 19 22:51:56 crc kubenswrapper[4795]: W0219 22:51:56.821836 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod188a11e4_50de_4672_baaf_89a3a512cd0c.slice/crio-6174c539b4290276b5a7764ab336641ae36237e6572ffb7135237fb585aa9422 WatchSource:0}: Error finding container 6174c539b4290276b5a7764ab336641ae36237e6572ffb7135237fb585aa9422: Status 404 returned error can't find the container with id 6174c539b4290276b5a7764ab336641ae36237e6572ffb7135237fb585aa9422 Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.706427 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"188a11e4-50de-4672-baaf-89a3a512cd0c","Type":"ContainerStarted","Data":"16143a7141f0493d70240cd1893314445bd55ba212e2b5433b67e9bfe0a968f8"} Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.706747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"188a11e4-50de-4672-baaf-89a3a512cd0c","Type":"ContainerStarted","Data":"5ed80e8f09df17ce3b9626981e388bc78869cdc68ac78e8ab98739f8160677c0"} Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.706758 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"188a11e4-50de-4672-baaf-89a3a512cd0c","Type":"ContainerStarted","Data":"6174c539b4290276b5a7764ab336641ae36237e6572ffb7135237fb585aa9422"} Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.729327 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" 
podStartSLOduration=4.72930162 podStartE2EDuration="4.72930162s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.820985484 +0000 UTC m=+5028.013503358" watchObservedRunningTime="2026-02-19 22:51:57.72930162 +0000 UTC m=+5028.921819524" Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.736067 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.736050314 podStartE2EDuration="4.736050314s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:57.7272138 +0000 UTC m=+5028.919731704" watchObservedRunningTime="2026-02-19 22:51:57.736050314 +0000 UTC m=+5028.928568218" Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.879147 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.941057 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.951371 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.201972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.240427 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.254417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:58 crc 
kubenswrapper[4795]: I0219 22:51:58.270885 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.315603 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.716383 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.716922 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:59 crc kubenswrapper[4795]: I0219 22:51:59.879262 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:59 crc kubenswrapper[4795]: I0219 22:51:59.940778 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:59 crc kubenswrapper[4795]: I0219 22:51:59.951472 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.254020 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.259985 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.280735 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.457682 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"] Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.459027 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.462377 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.483858 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"] Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.511937 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:52:00 crc kubenswrapper[4795]: E0219 22:52:00.512183 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.532347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.532398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.532525 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.532677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.633778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.633827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.633860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.633946 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.634773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.634794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.635092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.653596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.790881 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.924226 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.969357 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.009412 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.009759 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.063690 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.082468 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.136551 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"] Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.168653 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"] Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.170006 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.175998 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.191089 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"] Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.230885 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"] Feb 19 22:52:01 crc kubenswrapper[4795]: W0219 22:52:01.236505 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71730cb0_0d62_496c_b20a_590bc258489b.slice/crio-514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7 WatchSource:0}: Error finding container 514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7: Status 404 returned error can't find the container with id 514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7 Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250482 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.296628 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" 
(UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.353211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.353821 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 
19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.353833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.353995 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.370950 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.495079 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.733662 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"]
Feb 19 22:52:01 crc kubenswrapper[4795]: W0219 22:52:01.741077 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee8a52c2_f6ad_4b2e_a092_9393dac0f15a.slice/crio-759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf WatchSource:0}: Error finding container 759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf: Status 404 returned error can't find the container with id 759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.741530 4795 generic.go:334] "Generic (PLEG): container finished" podID="71730cb0-0d62-496c-b20a-590bc258489b" containerID="95a83942347a601497cb76f3b33f1eec961796f6d98935a98ad389d2a4ed0165" exitCode=0
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.741629 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" event={"ID":"71730cb0-0d62-496c-b20a-590bc258489b","Type":"ContainerDied","Data":"95a83942347a601497cb76f3b33f1eec961796f6d98935a98ad389d2a4ed0165"}
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.741675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" event={"ID":"71730cb0-0d62-496c-b20a-590bc258489b","Type":"ContainerStarted","Data":"514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7"}
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.817470 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.103597 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.177743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") pod \"71730cb0-0d62-496c-b20a-590bc258489b\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") "
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.177810 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") pod \"71730cb0-0d62-496c-b20a-590bc258489b\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") "
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.177934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") pod \"71730cb0-0d62-496c-b20a-590bc258489b\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") "
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.177974 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") pod \"71730cb0-0d62-496c-b20a-590bc258489b\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") "
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.182268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg" (OuterVolumeSpecName: "kube-api-access-pqqtg") pod "71730cb0-0d62-496c-b20a-590bc258489b" (UID: "71730cb0-0d62-496c-b20a-590bc258489b"). InnerVolumeSpecName "kube-api-access-pqqtg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.195623 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71730cb0-0d62-496c-b20a-590bc258489b" (UID: "71730cb0-0d62-496c-b20a-590bc258489b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.197351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71730cb0-0d62-496c-b20a-590bc258489b" (UID: "71730cb0-0d62-496c-b20a-590bc258489b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.200893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config" (OuterVolumeSpecName: "config") pod "71730cb0-0d62-496c-b20a-590bc258489b" (UID: "71730cb0-0d62-496c-b20a-590bc258489b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.280078 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.283915 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.283975 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.283990 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.751582 4795 generic.go:334] "Generic (PLEG): container finished" podID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerID="02b91837f0ddc7d9ab267bf7d71d729c37ae5422cf7fed3035a9ff30fcca558e" exitCode=0
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.751668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerDied","Data":"02b91837f0ddc7d9ab267bf7d71d729c37ae5422cf7fed3035a9ff30fcca558e"}
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.752016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerStarted","Data":"759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf"}
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.753818 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.753950 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" event={"ID":"71730cb0-0d62-496c-b20a-590bc258489b","Type":"ContainerDied","Data":"514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7"}
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.754731 4795 scope.go:117] "RemoveContainer" containerID="95a83942347a601497cb76f3b33f1eec961796f6d98935a98ad389d2a4ed0165"
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.992422 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"]
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.003577 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"]
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.525453 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71730cb0-0d62-496c-b20a-590bc258489b" path="/var/lib/kubelet/pods/71730cb0-0d62-496c-b20a-590bc258489b/volumes"
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.769948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerStarted","Data":"b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0"}
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.770638 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.801772 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" podStartSLOduration=2.801745679 podStartE2EDuration="2.801745679s" podCreationTimestamp="2026-02-19 22:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:03.796790037 +0000 UTC m=+5034.989307921" watchObservedRunningTime="2026-02-19 22:52:03.801745679 +0000 UTC m=+5034.994263553"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.849906 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"]
Feb 19 22:52:04 crc kubenswrapper[4795]: E0219 22:52:04.850284 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71730cb0-0d62-496c-b20a-590bc258489b" containerName="init"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.850302 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="71730cb0-0d62-496c-b20a-590bc258489b" containerName="init"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.850597 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="71730cb0-0d62-496c-b20a-590bc258489b" containerName="init"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.851763 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.854010 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.863927 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.031849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.032131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcjlg\" (UniqueName: \"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.032245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.133192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcjlg\" (UniqueName: \"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.133446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.133733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.137480 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.137546 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/272a78958a7306caa675bdcba31e4abf2af9d4231e8b5f084d7a94734d563c34/globalmount\"" pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.148628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.153527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcjlg\" (UniqueName: \"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.178334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.476943 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.982056 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Feb 19 22:52:05 crc kubenswrapper[4795]: W0219 22:52:05.983653 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59d77cc9_140e_4468_9023_0a973155d290.slice/crio-d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8 WatchSource:0}: Error finding container d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8: Status 404 returned error can't find the container with id d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8
Feb 19 22:52:06 crc kubenswrapper[4795]: I0219 22:52:06.796788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"59d77cc9-140e-4468-9023-0a973155d290","Type":"ContainerStarted","Data":"a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f"}
Feb 19 22:52:06 crc kubenswrapper[4795]: I0219 22:52:06.797097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"59d77cc9-140e-4468-9023-0a973155d290","Type":"ContainerStarted","Data":"d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8"}
Feb 19 22:52:06 crc kubenswrapper[4795]: I0219 22:52:06.832306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.379894209 podStartE2EDuration="3.832279446s" podCreationTimestamp="2026-02-19 22:52:03 +0000 UTC" firstStartedPulling="2026-02-19 22:52:05.986335149 +0000 UTC m=+5037.178853013" lastFinishedPulling="2026-02-19 22:52:06.438720386 +0000 UTC m=+5037.631238250" observedRunningTime="2026-02-19 22:52:06.820516579 +0000 UTC m=+5038.013034483" watchObservedRunningTime="2026-02-19 22:52:06.832279446 +0000 UTC m=+5038.024797350"
Feb 19 22:52:09 crc kubenswrapper[4795]: E0219 22:52:09.076297 4795 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.69:60290->38.102.83.69:37561: read tcp 38.102.83.69:60290->38.102.83.69:37561: read: connection reset by peer
Feb 19 22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.496547 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.609331 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"]
Feb 19 22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.609597 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="dnsmasq-dns" containerID="cri-o://00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69" gracePeriod=10
Feb 19 22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.842589 4795 generic.go:334] "Generic (PLEG): container finished" podID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerID="00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69" exitCode=0
Feb 19 22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.842632 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerDied","Data":"00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69"}
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.046814 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.145036 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") pod \"00c8c1c0-da57-4169-a42c-b52386ed3112\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") "
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.145126 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") pod \"00c8c1c0-da57-4169-a42c-b52386ed3112\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") "
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.145152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") pod \"00c8c1c0-da57-4169-a42c-b52386ed3112\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") "
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.149892 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb" (OuterVolumeSpecName: "kube-api-access-vpmqb") pod "00c8c1c0-da57-4169-a42c-b52386ed3112" (UID: "00c8c1c0-da57-4169-a42c-b52386ed3112"). InnerVolumeSpecName "kube-api-access-vpmqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.183022 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config" (OuterVolumeSpecName: "config") pod "00c8c1c0-da57-4169-a42c-b52386ed3112" (UID: "00c8c1c0-da57-4169-a42c-b52386ed3112"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.183086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00c8c1c0-da57-4169-a42c-b52386ed3112" (UID: "00c8c1c0-da57-4169-a42c-b52386ed3112"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.246954 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.247009 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.247019 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.373213 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 19 22:52:12 crc kubenswrapper[4795]: E0219 22:52:12.373504 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="init"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.373516 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="init"
Feb 19 22:52:12 crc kubenswrapper[4795]: E0219 22:52:12.373533 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="dnsmasq-dns"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.373540 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="dnsmasq-dns"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.373682 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="dnsmasq-dns"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.374429 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.376294 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-54jvp"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.376612 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.381805 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.395759 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b989c1be-7a74-42ee-a27b-dc34ce8d727a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551642 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989c1be-7a74-42ee-a27b-dc34ce8d727a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-scripts\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551818 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-config\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwv7w\" (UniqueName: \"kubernetes.io/projected/b989c1be-7a74-42ee-a27b-dc34ce8d727a-kube-api-access-kwv7w\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.653704 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwv7w\" (UniqueName: \"kubernetes.io/projected/b989c1be-7a74-42ee-a27b-dc34ce8d727a-kube-api-access-kwv7w\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.653866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b989c1be-7a74-42ee-a27b-dc34ce8d727a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.653900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989c1be-7a74-42ee-a27b-dc34ce8d727a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.653950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-scripts\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.654420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-config\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.654816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b989c1be-7a74-42ee-a27b-dc34ce8d727a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.654889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-scripts\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.655800 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-config\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.657243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989c1be-7a74-42ee-a27b-dc34ce8d727a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.681583 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwv7w\" (UniqueName: \"kubernetes.io/projected/b989c1be-7a74-42ee-a27b-dc34ce8d727a-kube-api-access-kwv7w\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.692565 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.851684 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerDied","Data":"188ca574ffaf1ffa388763be3f13a7eb4afedf6a896f0e7de263093402b89351"}
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.851944 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.851974 4795 scope.go:117] "RemoveContainer" containerID="00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.876210 4795 scope.go:117] "RemoveContainer" containerID="99d134900f88fca6c416350229ec6ebd6aa32403658dd7fa964885fc2b39579a"
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.897182 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"]
Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.902888 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"]
Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.118088 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.521449 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" path="/var/lib/kubelet/pods/00c8c1c0-da57-4169-a42c-b52386ed3112/volumes"
Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.863053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b989c1be-7a74-42ee-a27b-dc34ce8d727a","Type":"ContainerStarted","Data":"0c600f96eff9c987bbf1c42f97977ddf8c53ab241d1f8c672da01b40b928914e"}
Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.863473 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.863496 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b989c1be-7a74-42ee-a27b-dc34ce8d727a","Type":"ContainerStarted","Data":"c57eac7d730fe0b608bf5efe642341194773ab7bf8b980686440d17b6be9377d"}
Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.863506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b989c1be-7a74-42ee-a27b-dc34ce8d727a","Type":"ContainerStarted","Data":"1975f0327f0564a3a40276b367239b249c816055a1dcd73225c90dc3f0fa2d3b"}
Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.891814 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.8917904079999999 podStartE2EDuration="1.891790408s" podCreationTimestamp="2026-02-19 22:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:13.881795021 +0000 UTC m=+5045.074312905" watchObservedRunningTime="2026-02-19 22:52:13.891790408 +0000 UTC m=+5045.084308282"
Feb 19 22:52:14 crc kubenswrapper[4795]: I0219 22:52:14.512179 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"
Feb 19 22:52:14 crc kubenswrapper[4795]: E0219 22:52:14.512619 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.271331 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pgc5v"]
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.272615 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pgc5v"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.293604 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pgc5v"]
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.308524 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"]
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.309940 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-af99-account-create-update-5rdfd"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.320072 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.340361 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"]
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.471106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.471314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.471423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5rc4\" (UniqueName: \"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.471470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.573512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.573736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.573890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5rc4\" (UniqueName: \"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.573963 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.574290 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.576030 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.600246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5rc4\" (UniqueName: \"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.600878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v"
Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.643991 4795
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.891820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.120635 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"] Feb 19 22:52:19 crc kubenswrapper[4795]: W0219 22:52:19.128777 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e0382a_40d3_42e1_93d3_e5098af1e54f.slice/crio-78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c WatchSource:0}: Error finding container 78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c: Status 404 returned error can't find the container with id 78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.335743 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pgc5v"] Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.913542 4795 generic.go:334] "Generic (PLEG): container finished" podID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" containerID="e2855b43837d8ca73ce1e28756facf9b648ea7b22ec3a7d6bc132133d332b880" exitCode=0 Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.914984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pgc5v" event={"ID":"036fd6f7-0c88-4c92-9a98-0a774124c8fd","Type":"ContainerDied","Data":"e2855b43837d8ca73ce1e28756facf9b648ea7b22ec3a7d6bc132133d332b880"} Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.915050 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pgc5v" 
event={"ID":"036fd6f7-0c88-4c92-9a98-0a774124c8fd","Type":"ContainerStarted","Data":"195a684b2f3b5afb8957fbe27e13ee24b478b3afdc0c86ac9506472105a02b91"} Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.915636 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" containerID="f711a9de7781799454b8f0272a2636f2533360c8ba6ce3a134936f4cf9908d61" exitCode=0 Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.915675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-af99-account-create-update-5rdfd" event={"ID":"d1e0382a-40d3-42e1-93d3-e5098af1e54f","Type":"ContainerDied","Data":"f711a9de7781799454b8f0272a2636f2533360c8ba6ce3a134936f4cf9908d61"} Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.915694 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-af99-account-create-update-5rdfd" event={"ID":"d1e0382a-40d3-42e1-93d3-e5098af1e54f","Type":"ContainerStarted","Data":"78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c"} Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.225238 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.322274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") pod \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.322440 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") pod \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.323099 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "036fd6f7-0c88-4c92-9a98-0a774124c8fd" (UID: "036fd6f7-0c88-4c92-9a98-0a774124c8fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.327955 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7" (OuterVolumeSpecName: "kube-api-access-zczw7") pod "036fd6f7-0c88-4c92-9a98-0a774124c8fd" (UID: "036fd6f7-0c88-4c92-9a98-0a774124c8fd"). InnerVolumeSpecName "kube-api-access-zczw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.385074 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.423705 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") pod \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.424033 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5rc4\" (UniqueName: \"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") pod \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.424406 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.424444 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.424682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1e0382a-40d3-42e1-93d3-e5098af1e54f" (UID: "d1e0382a-40d3-42e1-93d3-e5098af1e54f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.427000 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4" (OuterVolumeSpecName: "kube-api-access-v5rc4") pod "d1e0382a-40d3-42e1-93d3-e5098af1e54f" (UID: "d1e0382a-40d3-42e1-93d3-e5098af1e54f"). InnerVolumeSpecName "kube-api-access-v5rc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.525549 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.525587 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5rc4\" (UniqueName: \"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.932454 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pgc5v" event={"ID":"036fd6f7-0c88-4c92-9a98-0a774124c8fd","Type":"ContainerDied","Data":"195a684b2f3b5afb8957fbe27e13ee24b478b3afdc0c86ac9506472105a02b91"} Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.932501 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="195a684b2f3b5afb8957fbe27e13ee24b478b3afdc0c86ac9506472105a02b91" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.932783 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.934323 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-af99-account-create-update-5rdfd" event={"ID":"d1e0382a-40d3-42e1-93d3-e5098af1e54f","Type":"ContainerDied","Data":"78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c"} Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.934357 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.934361 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.769987 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 22:52:23 crc kubenswrapper[4795]: E0219 22:52:23.770711 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" containerName="mariadb-database-create" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.770727 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" containerName="mariadb-database-create" Feb 19 22:52:23 crc kubenswrapper[4795]: E0219 22:52:23.770761 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" containerName="mariadb-account-create-update" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.770769 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" containerName="mariadb-account-create-update" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.770960 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" 
containerName="mariadb-account-create-update" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.770990 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" containerName="mariadb-database-create" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.771637 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.774413 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.774557 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bpdln" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.775050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.776465 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.786097 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.962491 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.962736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") pod \"keystone-db-sync-bm6ln\" (UID: 
\"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.962779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.064532 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.064697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.064721 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.072452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" 
Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.074355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.086730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.092876 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.543364 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 22:52:24 crc kubenswrapper[4795]: W0219 22:52:24.553295 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96625ae6_8eb0_43d0_a180_20c79dfd6717.slice/crio-b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807 WatchSource:0}: Error finding container b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807: Status 404 returned error can't find the container with id b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807 Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.967672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bm6ln" event={"ID":"96625ae6-8eb0-43d0-a180-20c79dfd6717","Type":"ContainerStarted","Data":"000fa8f747858d495d2c6d2c850ff5f2717d6c22150f68c950eeae0bca0bd7f7"} Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.968012 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bm6ln" event={"ID":"96625ae6-8eb0-43d0-a180-20c79dfd6717","Type":"ContainerStarted","Data":"b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807"} Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.991810 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-bm6ln" podStartSLOduration=1.9917846670000001 podStartE2EDuration="1.991784667s" podCreationTimestamp="2026-02-19 22:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:24.984412195 +0000 UTC m=+5056.176930099" watchObservedRunningTime="2026-02-19 22:52:24.991784667 +0000 UTC m=+5056.184302551" Feb 19 22:52:26 crc kubenswrapper[4795]: I0219 22:52:26.987475 4795 generic.go:334] "Generic (PLEG): container finished" podID="96625ae6-8eb0-43d0-a180-20c79dfd6717" containerID="000fa8f747858d495d2c6d2c850ff5f2717d6c22150f68c950eeae0bca0bd7f7" exitCode=0 Feb 19 22:52:26 crc kubenswrapper[4795]: I0219 22:52:26.987554 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bm6ln" event={"ID":"96625ae6-8eb0-43d0-a180-20c79dfd6717","Type":"ContainerDied","Data":"000fa8f747858d495d2c6d2c850ff5f2717d6c22150f68c950eeae0bca0bd7f7"} Feb 19 22:52:27 crc kubenswrapper[4795]: I0219 22:52:27.514137 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:52:27 crc kubenswrapper[4795]: E0219 22:52:27.515647 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.402701 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.571103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") pod \"96625ae6-8eb0-43d0-a180-20c79dfd6717\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.571386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") pod \"96625ae6-8eb0-43d0-a180-20c79dfd6717\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.571551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") pod \"96625ae6-8eb0-43d0-a180-20c79dfd6717\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.579450 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v" (OuterVolumeSpecName: "kube-api-access-wrw9v") pod "96625ae6-8eb0-43d0-a180-20c79dfd6717" (UID: "96625ae6-8eb0-43d0-a180-20c79dfd6717"). InnerVolumeSpecName "kube-api-access-wrw9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.599711 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96625ae6-8eb0-43d0-a180-20c79dfd6717" (UID: "96625ae6-8eb0-43d0-a180-20c79dfd6717"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.640393 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data" (OuterVolumeSpecName: "config-data") pod "96625ae6-8eb0-43d0-a180-20c79dfd6717" (UID: "96625ae6-8eb0-43d0-a180-20c79dfd6717"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.673580 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.673630 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.673652 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.013768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bm6ln" 
event={"ID":"96625ae6-8eb0-43d0-a180-20c79dfd6717","Type":"ContainerDied","Data":"b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807"} Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.013804 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.013905 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.252620 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"] Feb 19 22:52:29 crc kubenswrapper[4795]: E0219 22:52:29.252986 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96625ae6-8eb0-43d0-a180-20c79dfd6717" containerName="keystone-db-sync" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.253012 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="96625ae6-8eb0-43d0-a180-20c79dfd6717" containerName="keystone-db-sync" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.253350 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="96625ae6-8eb0-43d0-a180-20c79dfd6717" containerName="keystone-db-sync" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.254636 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.274038 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"] Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.307058 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-26wvp"] Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.308221 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312374 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312697 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312529 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bpdln" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312529 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312543 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.326105 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-26wvp"] Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.386867 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.386927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.386959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387031 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.488068 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489010 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489104 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489201 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489309 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489407 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489808 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489886 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.490260 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.490366 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.490574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.492885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.493181 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.493313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.493419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.494125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.505321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.505745 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.580005 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.657611 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:30 crc kubenswrapper[4795]: W0219 22:52:30.034946 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74bde2c2_542d_4473_8a2d_4276ef12f1a1.slice/crio-bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f WatchSource:0}: Error finding container bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f: Status 404 returned error can't find the container with id bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f
Feb 19 22:52:30 crc kubenswrapper[4795]: I0219 22:52:30.035868 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"]
Feb 19 22:52:30 crc kubenswrapper[4795]: I0219 22:52:30.108692 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-26wvp"]
Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.033513 4795 generic.go:334] "Generic (PLEG): container finished" podID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerID="16b74941371be4fa54bab7807a1264eec35a2ff59fcbcca048e96bbfbf300be4" exitCode=0
Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.033593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerDied","Data":"16b74941371be4fa54bab7807a1264eec35a2ff59fcbcca048e96bbfbf300be4"}
Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.033957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerStarted","Data":"bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f"}
Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.035798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-26wvp" event={"ID":"415c9781-58d2-447a-8e0c-2fed3a02ef09","Type":"ContainerStarted","Data":"ccef51520029c7e6d5f912ec1258a3f7218e74073cc8ab1ae05322f167a9a678"}
Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.035857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-26wvp" event={"ID":"415c9781-58d2-447a-8e0c-2fed3a02ef09","Type":"ContainerStarted","Data":"3c586b18913896c81e735ec4b2b818346aead905ba07882b9ec1c256d7a50791"}
Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.097610 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-26wvp" podStartSLOduration=2.097583323 podStartE2EDuration="2.097583323s" podCreationTimestamp="2026-02-19 22:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:31.086757012 +0000 UTC m=+5062.279274886" watchObservedRunningTime="2026-02-19 22:52:31.097583323 +0000 UTC m=+5062.290101227"
Feb 19 22:52:32 crc kubenswrapper[4795]: I0219 22:52:32.045489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerStarted","Data":"1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701"}
Feb 19 22:52:32 crc kubenswrapper[4795]: I0219 22:52:32.071202 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" podStartSLOduration=3.071183499 podStartE2EDuration="3.071183499s" podCreationTimestamp="2026-02-19 22:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:32.062556622 +0000 UTC m=+5063.255074486" watchObservedRunningTime="2026-02-19 22:52:32.071183499 +0000 UTC m=+5063.263701363"
Feb 19 22:52:32 crc kubenswrapper[4795]: I0219 22:52:32.769644 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 19 22:52:33 crc kubenswrapper[4795]: I0219 22:52:33.056620 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:34 crc kubenswrapper[4795]: I0219 22:52:34.067985 4795 generic.go:334] "Generic (PLEG): container finished" podID="415c9781-58d2-447a-8e0c-2fed3a02ef09" containerID="ccef51520029c7e6d5f912ec1258a3f7218e74073cc8ab1ae05322f167a9a678" exitCode=0
Feb 19 22:52:34 crc kubenswrapper[4795]: I0219 22:52:34.068099 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-26wvp" event={"ID":"415c9781-58d2-447a-8e0c-2fed3a02ef09","Type":"ContainerDied","Data":"ccef51520029c7e6d5f912ec1258a3f7218e74073cc8ab1ae05322f167a9a678"}
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.444541 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") "
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517560 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") "
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517662 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") "
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") "
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517868 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") "
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517925 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") "
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.523859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr" (OuterVolumeSpecName: "kube-api-access-6hjfr") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "kube-api-access-6hjfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.532309 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.534434 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts" (OuterVolumeSpecName: "scripts") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.537216 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.550341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.559004 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data" (OuterVolumeSpecName: "config-data") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620288 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620329 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620341 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620351 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620362 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620373 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.086549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-26wvp" event={"ID":"415c9781-58d2-447a-8e0c-2fed3a02ef09","Type":"ContainerDied","Data":"3c586b18913896c81e735ec4b2b818346aead905ba07882b9ec1c256d7a50791"}
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.086598 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c586b18913896c81e735ec4b2b818346aead905ba07882b9ec1c256d7a50791"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.086677 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-26wvp"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.190149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-26wvp"]
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.202939 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-26wvp"]
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.270847 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tghht"]
Feb 19 22:52:36 crc kubenswrapper[4795]: E0219 22:52:36.271393 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415c9781-58d2-447a-8e0c-2fed3a02ef09" containerName="keystone-bootstrap"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.271427 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="415c9781-58d2-447a-8e0c-2fed3a02ef09" containerName="keystone-bootstrap"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.271692 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="415c9781-58d2-447a-8e0c-2fed3a02ef09" containerName="keystone-bootstrap"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.272505 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.274725 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.275783 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bpdln"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.276173 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.276430 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.277786 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.284077 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tghht"]
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330376 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lqz7\" (UniqueName: \"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lqz7\" (UniqueName: \"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432699 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.437676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.437718 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.437952 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.438061 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.441901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.455582 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lqz7\" (UniqueName: \"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.586791 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tghht"
Feb 19 22:52:37 crc kubenswrapper[4795]: I0219 22:52:37.023588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tghht"]
Feb 19 22:52:37 crc kubenswrapper[4795]: I0219 22:52:37.094646 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tghht" event={"ID":"fe89b6c7-308b-42a8-92a9-da093d6bbae4","Type":"ContainerStarted","Data":"a7cebc2666dd4130e92c6c4d3c68c5b05d24b04e3bc0724f42b292ec6cb3c474"}
Feb 19 22:52:37 crc kubenswrapper[4795]: I0219 22:52:37.525645 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="415c9781-58d2-447a-8e0c-2fed3a02ef09" path="/var/lib/kubelet/pods/415c9781-58d2-447a-8e0c-2fed3a02ef09/volumes"
Feb 19 22:52:38 crc kubenswrapper[4795]: I0219 22:52:38.131562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tghht" event={"ID":"fe89b6c7-308b-42a8-92a9-da093d6bbae4","Type":"ContainerStarted","Data":"2119979e9a0784ca3835a55937d4bf0a118f5016d04e68e3c74e02efc1b7df06"}
Feb 19 22:52:38 crc kubenswrapper[4795]: I0219 22:52:38.156912 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tghht" podStartSLOduration=2.15689112 podStartE2EDuration="2.15689112s" podCreationTimestamp="2026-02-19 22:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:38.149791606 +0000 UTC m=+5069.342309500" watchObservedRunningTime="2026-02-19 22:52:38.15689112 +0000 UTC m=+5069.349409004"
Feb 19 22:52:38 crc kubenswrapper[4795]: I0219 22:52:38.512001 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"
Feb 19 22:52:38 crc kubenswrapper[4795]: E0219 22:52:38.512603 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:52:39 crc kubenswrapper[4795]: I0219 22:52:39.582235 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:52:39 crc kubenswrapper[4795]: I0219 22:52:39.654340 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"]
Feb 19 22:52:39 crc kubenswrapper[4795]: I0219 22:52:39.654655 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="dnsmasq-dns" containerID="cri-o://b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0" gracePeriod=10
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.160745 4795 generic.go:334] "Generic (PLEG): container finished" podID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" containerID="2119979e9a0784ca3835a55937d4bf0a118f5016d04e68e3c74e02efc1b7df06" exitCode=0
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.160974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tghht" event={"ID":"fe89b6c7-308b-42a8-92a9-da093d6bbae4","Type":"ContainerDied","Data":"2119979e9a0784ca3835a55937d4bf0a118f5016d04e68e3c74e02efc1b7df06"}
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.163339 4795 generic.go:334] "Generic (PLEG): container finished" podID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerID="b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0" exitCode=0
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.163361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerDied","Data":"b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0"}
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.246002 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") "
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") "
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") "
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") "
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397664 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") "
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.404290 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl" (OuterVolumeSpecName: "kube-api-access-rrnhl") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "kube-api-access-rrnhl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.439307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config" (OuterVolumeSpecName: "config") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.446844 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.452071 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.460544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499753 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499785 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499794 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499803 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499813 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.181080 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" 
event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerDied","Data":"759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf"} Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.184494 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.184567 4795 scope.go:117] "RemoveContainer" containerID="b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.232559 4795 scope.go:117] "RemoveContainer" containerID="02b91837f0ddc7d9ab267bf7d71d729c37ae5422cf7fed3035a9ff30fcca558e" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.246206 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"] Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.266642 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"] Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.523827 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" path="/var/lib/kubelet/pods/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a/volumes" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.573511 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.719231 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.719653 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.719689 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.719732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lqz7\" (UniqueName: \"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.720327 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.720401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.731246 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.731935 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts" (OuterVolumeSpecName: "scripts") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.731972 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7" (OuterVolumeSpecName: "kube-api-access-6lqz7") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "kube-api-access-6lqz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.732389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.739045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data" (OuterVolumeSpecName: "config-data") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.754860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822401 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822441 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822450 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822461 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lqz7\" (UniqueName: \"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") on node \"crc\" DevicePath \"\"" Feb 
19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822470 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822478 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.191459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tghht" event={"ID":"fe89b6c7-308b-42a8-92a9-da093d6bbae4","Type":"ContainerDied","Data":"a7cebc2666dd4130e92c6c4d3c68c5b05d24b04e3bc0724f42b292ec6cb3c474"} Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.192092 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7cebc2666dd4130e92c6c4d3c68c5b05d24b04e3bc0724f42b292ec6cb3c474" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.191525 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287021 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-689ff8fbd7-j2v4l"] Feb 19 22:52:42 crc kubenswrapper[4795]: E0219 22:52:42.287502 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="dnsmasq-dns" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287525 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="dnsmasq-dns" Feb 19 22:52:42 crc kubenswrapper[4795]: E0219 22:52:42.287574 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="init" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287583 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="init" Feb 19 22:52:42 crc kubenswrapper[4795]: E0219 22:52:42.287593 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" containerName="keystone-bootstrap" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287603 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" containerName="keystone-bootstrap" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287830 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="dnsmasq-dns" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287875 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" containerName="keystone-bootstrap" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.288699 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.290981 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.291224 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.291270 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.291451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bpdln" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.305715 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-689ff8fbd7-j2v4l"] Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.432933 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh7tz\" (UniqueName: \"kubernetes.io/projected/57c39d61-cab0-49e7-8938-06952896387e-kube-api-access-xh7tz\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-config-data\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433028 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-combined-ca-bundle\") pod 
\"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-fernet-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433077 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-scripts\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-credential-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh7tz\" (UniqueName: \"kubernetes.io/projected/57c39d61-cab0-49e7-8938-06952896387e-kube-api-access-xh7tz\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-config-data\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: 
\"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-combined-ca-bundle\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534556 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-fernet-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-scripts\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-credential-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.538881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-credential-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc 
kubenswrapper[4795]: I0219 22:52:42.539045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-fernet-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.539605 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-config-data\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.539799 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-combined-ca-bundle\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.540154 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-scripts\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.551338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh7tz\" (UniqueName: \"kubernetes.io/projected/57c39d61-cab0-49e7-8938-06952896387e-kube-api-access-xh7tz\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.609067 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:43 crc kubenswrapper[4795]: I0219 22:52:43.022075 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-689ff8fbd7-j2v4l"] Feb 19 22:52:43 crc kubenswrapper[4795]: I0219 22:52:43.202942 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-689ff8fbd7-j2v4l" event={"ID":"57c39d61-cab0-49e7-8938-06952896387e","Type":"ContainerStarted","Data":"a5b34a7617681ccb01188f3f8df30993d2c24e886985165554122e6148e7e686"} Feb 19 22:52:44 crc kubenswrapper[4795]: I0219 22:52:44.217689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-689ff8fbd7-j2v4l" event={"ID":"57c39d61-cab0-49e7-8938-06952896387e","Type":"ContainerStarted","Data":"498e012de73e1306e1a1f950ab4d480f4813c283f654129da15f41fffb0fd674"} Feb 19 22:52:44 crc kubenswrapper[4795]: I0219 22:52:44.218150 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:44 crc kubenswrapper[4795]: I0219 22:52:44.246123 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-689ff8fbd7-j2v4l" podStartSLOduration=2.246101869 podStartE2EDuration="2.246101869s" podCreationTimestamp="2026-02-19 22:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:44.238245934 +0000 UTC m=+5075.430763798" watchObservedRunningTime="2026-02-19 22:52:44.246101869 +0000 UTC m=+5075.438619733" Feb 19 22:52:53 crc kubenswrapper[4795]: I0219 22:52:53.512297 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:52:53 crc kubenswrapper[4795]: E0219 22:52:53.512933 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:06 crc kubenswrapper[4795]: I0219 22:53:06.511458 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:06 crc kubenswrapper[4795]: E0219 22:53:06.512127 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:14 crc kubenswrapper[4795]: I0219 22:53:14.006826 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.150696 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.153578 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.155713 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.156271 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8rdbp" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.158100 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.176556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.206584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sfjd\" (UniqueName: \"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.206800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.207034 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.308019 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.308075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sfjd\" (UniqueName: \"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.308127 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.309070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.315772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.324457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sfjd\" (UniqueName: 
\"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.510882 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:18 crc kubenswrapper[4795]: E0219 22:53:18.511667 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.543038 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.973360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 22:53:19 crc kubenswrapper[4795]: I0219 22:53:19.507783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"54e90f84-703c-41b3-85c2-dd4ce9e3a968","Type":"ContainerStarted","Data":"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd"} Feb 19 22:53:19 crc kubenswrapper[4795]: I0219 22:53:19.508415 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"54e90f84-703c-41b3-85c2-dd4ce9e3a968","Type":"ContainerStarted","Data":"c80b0ea28a23506da3a2169be37d63f31f0ce0b86f7443217beacd84fe00e7de"} Feb 19 22:53:19 crc kubenswrapper[4795]: I0219 22:53:19.539904 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.5398829059999999 
podStartE2EDuration="1.539882906s" podCreationTimestamp="2026-02-19 22:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:53:19.533880764 +0000 UTC m=+5110.726398668" watchObservedRunningTime="2026-02-19 22:53:19.539882906 +0000 UTC m=+5110.732400780" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.309447 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.315062 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.340614 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.448957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.449139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.449214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") pod 
\"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.512313 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:31 crc kubenswrapper[4795]: E0219 22:53:31.512905 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.550942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.551055 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.551081 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 
22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.551451 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.551504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.570690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.650583 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.102877 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:32 crc kubenswrapper[4795]: W0219 22:53:32.107286 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557ec556_8442_4d6a_a634_4fa240dc96dd.slice/crio-a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e WatchSource:0}: Error finding container a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e: Status 404 returned error can't find the container with id a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.633599 4795 generic.go:334] "Generic (PLEG): container finished" podID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerID="c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb" exitCode=0 Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.633649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerDied","Data":"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb"} Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.633680 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerStarted","Data":"a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e"} Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.635700 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:53:33 crc kubenswrapper[4795]: I0219 22:53:33.643287 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerID="a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7" exitCode=0 Feb 19 22:53:33 crc kubenswrapper[4795]: I0219 22:53:33.643334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerDied","Data":"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7"} Feb 19 22:53:34 crc kubenswrapper[4795]: I0219 22:53:34.653315 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerStarted","Data":"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8"} Feb 19 22:53:34 crc kubenswrapper[4795]: I0219 22:53:34.672825 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f8r7g" podStartSLOduration=2.28759593 podStartE2EDuration="3.672806886s" podCreationTimestamp="2026-02-19 22:53:31 +0000 UTC" firstStartedPulling="2026-02-19 22:53:32.635338904 +0000 UTC m=+5123.827856778" lastFinishedPulling="2026-02-19 22:53:34.02054987 +0000 UTC m=+5125.213067734" observedRunningTime="2026-02-19 22:53:34.671290612 +0000 UTC m=+5125.863808476" watchObservedRunningTime="2026-02-19 22:53:34.672806886 +0000 UTC m=+5125.865324760" Feb 19 22:53:41 crc kubenswrapper[4795]: I0219 22:53:41.651654 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:41 crc kubenswrapper[4795]: I0219 22:53:41.652756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:41 crc kubenswrapper[4795]: I0219 22:53:41.698063 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:41 crc 
kubenswrapper[4795]: I0219 22:53:41.754478 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:41 crc kubenswrapper[4795]: I0219 22:53:41.942391 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:43 crc kubenswrapper[4795]: I0219 22:53:43.723236 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f8r7g" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="registry-server" containerID="cri-o://4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" gracePeriod=2 Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.293461 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.420523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") pod \"557ec556-8442-4d6a-a634-4fa240dc96dd\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.420613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") pod \"557ec556-8442-4d6a-a634-4fa240dc96dd\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.420713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") pod \"557ec556-8442-4d6a-a634-4fa240dc96dd\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " Feb 19 22:53:44 crc 
kubenswrapper[4795]: I0219 22:53:44.421888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities" (OuterVolumeSpecName: "utilities") pod "557ec556-8442-4d6a-a634-4fa240dc96dd" (UID: "557ec556-8442-4d6a-a634-4fa240dc96dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.426446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84" (OuterVolumeSpecName: "kube-api-access-frc84") pod "557ec556-8442-4d6a-a634-4fa240dc96dd" (UID: "557ec556-8442-4d6a-a634-4fa240dc96dd"). InnerVolumeSpecName "kube-api-access-frc84". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.444148 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "557ec556-8442-4d6a-a634-4fa240dc96dd" (UID: "557ec556-8442-4d6a-a634-4fa240dc96dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.521994 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.522032 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") on node \"crc\" DevicePath \"\"" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.522046 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735553 4795 generic.go:334] "Generic (PLEG): container finished" podID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerID="4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" exitCode=0 Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerDied","Data":"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8"} Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735648 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerDied","Data":"a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e"} Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735690 4795 scope.go:117] "RemoveContainer" containerID="4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.764056 4795 scope.go:117] "RemoveContainer" containerID="a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.781844 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.793067 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.805835 4795 scope.go:117] "RemoveContainer" containerID="c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.832718 4795 scope.go:117] "RemoveContainer" containerID="4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" Feb 19 22:53:44 crc kubenswrapper[4795]: E0219 22:53:44.833198 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8\": container with ID starting with 4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8 not found: ID does not exist" containerID="4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.833245 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8"} err="failed to get container status \"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8\": rpc error: code = NotFound desc = could not find container \"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8\": container with ID starting with 4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8 not found: ID does not exist" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.833276 4795 scope.go:117] "RemoveContainer" containerID="a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7" Feb 19 22:53:44 crc kubenswrapper[4795]: E0219 22:53:44.833720 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7\": container with ID starting with a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7 not found: ID does not exist" containerID="a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.833763 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7"} err="failed to get container status \"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7\": rpc error: code = NotFound desc = could not find container \"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7\": container with ID starting with a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7 not found: ID does not exist" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.833792 4795 scope.go:117] "RemoveContainer" containerID="c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb" Feb 19 22:53:44 crc kubenswrapper[4795]: E0219 
22:53:44.834095 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb\": container with ID starting with c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb not found: ID does not exist" containerID="c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.834147 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb"} err="failed to get container status \"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb\": rpc error: code = NotFound desc = could not find container \"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb\": container with ID starting with c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb not found: ID does not exist" Feb 19 22:53:45 crc kubenswrapper[4795]: I0219 22:53:45.512398 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:45 crc kubenswrapper[4795]: E0219 22:53:45.513382 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:45 crc kubenswrapper[4795]: I0219 22:53:45.528567 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" path="/var/lib/kubelet/pods/557ec556-8442-4d6a-a634-4fa240dc96dd/volumes" Feb 19 22:53:58 crc kubenswrapper[4795]: I0219 22:53:58.513471 
4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:58 crc kubenswrapper[4795]: E0219 22:53:58.514361 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:54:13 crc kubenswrapper[4795]: I0219 22:54:13.514343 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:54:13 crc kubenswrapper[4795]: E0219 22:54:13.515043 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:54:24 crc kubenswrapper[4795]: I0219 22:54:24.513388 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:54:24 crc kubenswrapper[4795]: E0219 22:54:24.514373 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:54:36 crc kubenswrapper[4795]: I0219 
22:54:36.511365 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:54:36 crc kubenswrapper[4795]: E0219 22:54:36.512250 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:54:49 crc kubenswrapper[4795]: I0219 22:54:49.517338 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:54:49 crc kubenswrapper[4795]: E0219 22:54:49.518030 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.615447 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 22:55:00 crc kubenswrapper[4795]: E0219 22:55:00.616341 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="registry-server" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.616355 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="registry-server" Feb 19 22:55:00 crc kubenswrapper[4795]: E0219 22:55:00.616377 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="extract-utilities" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.616383 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="extract-utilities" Feb 19 22:55:00 crc kubenswrapper[4795]: E0219 22:55:00.616396 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="extract-content" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.616402 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="extract-content" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.616554 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="registry-server" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.617072 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.625779 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.701058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.701430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " 
pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.710977 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.712079 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.715125 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.721272 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.803191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.803267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.803303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " 
pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.803381 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.804201 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.821191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.904628 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.904694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " 
pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.905424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.926463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.932602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:01 crc kubenswrapper[4795]: I0219 22:55:01.027618 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:01 crc kubenswrapper[4795]: I0219 22:55:01.372503 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 22:55:01 crc kubenswrapper[4795]: I0219 22:55:01.409475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v9rbk" event={"ID":"c13f05e4-27de-4750-bb9d-008e3a0be0c7","Type":"ContainerStarted","Data":"7c97af9ccf6f31d23cbe3a3815205b5749e3e039902947753042820a7aade8f7"} Feb 19 22:55:01 crc kubenswrapper[4795]: I0219 22:55:01.463604 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 22:55:01 crc kubenswrapper[4795]: W0219 22:55:01.465392 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ca6125_46fa_4dd9_8d20_3816b6c09066.slice/crio-41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3 WatchSource:0}: Error finding container 41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3: Status 404 returned error can't find the container with id 41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3 Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.419906 4795 generic.go:334] "Generic (PLEG): container finished" podID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" containerID="88b5a89b19e9675d8f8f4f6be28cd30648da8baae5b54e90d50ee586416168ce" exitCode=0 Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.419969 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1d9-account-create-update-bfmjz" event={"ID":"a4ca6125-46fa-4dd9-8d20-3816b6c09066","Type":"ContainerDied","Data":"88b5a89b19e9675d8f8f4f6be28cd30648da8baae5b54e90d50ee586416168ce"} Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.419996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-b1d9-account-create-update-bfmjz" event={"ID":"a4ca6125-46fa-4dd9-8d20-3816b6c09066","Type":"ContainerStarted","Data":"41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3"} Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.421061 4795 generic.go:334] "Generic (PLEG): container finished" podID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" containerID="c70a8662daa881e0007310348920cbd44bed7e794700942de323e6f34a2f57fc" exitCode=0 Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.421079 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v9rbk" event={"ID":"c13f05e4-27de-4750-bb9d-008e3a0be0c7","Type":"ContainerDied","Data":"c70a8662daa881e0007310348920cbd44bed7e794700942de323e6f34a2f57fc"} Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.511822 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:55:02 crc kubenswrapper[4795]: E0219 22:55:02.512152 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.824558 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.910618 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.955801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") pod \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.955845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") pod \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.956771 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c13f05e4-27de-4750-bb9d-008e3a0be0c7" (UID: "c13f05e4-27de-4750-bb9d-008e3a0be0c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.966569 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n" (OuterVolumeSpecName: "kube-api-access-kbr6n") pod "c13f05e4-27de-4750-bb9d-008e3a0be0c7" (UID: "c13f05e4-27de-4750-bb9d-008e3a0be0c7"). InnerVolumeSpecName "kube-api-access-kbr6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.057251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") pod \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.057415 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") pod \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.057741 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.057761 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.058218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4ca6125-46fa-4dd9-8d20-3816b6c09066" (UID: "a4ca6125-46fa-4dd9-8d20-3816b6c09066"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.060449 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn" (OuterVolumeSpecName: "kube-api-access-nkpcn") pod "a4ca6125-46fa-4dd9-8d20-3816b6c09066" (UID: "a4ca6125-46fa-4dd9-8d20-3816b6c09066"). InnerVolumeSpecName "kube-api-access-nkpcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.159005 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.159067 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.439712 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v9rbk" event={"ID":"c13f05e4-27de-4750-bb9d-008e3a0be0c7","Type":"ContainerDied","Data":"7c97af9ccf6f31d23cbe3a3815205b5749e3e039902947753042820a7aade8f7"} Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.439749 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.439757 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c97af9ccf6f31d23cbe3a3815205b5749e3e039902947753042820a7aade8f7" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.441217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1d9-account-create-update-bfmjz" event={"ID":"a4ca6125-46fa-4dd9-8d20-3816b6c09066","Type":"ContainerDied","Data":"41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3"} Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.441299 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.441314 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3" Feb 19 22:55:04 crc kubenswrapper[4795]: E0219 22:55:04.494542 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13f05e4_27de_4750_bb9d_008e3a0be0c7.slice/crio-7c97af9ccf6f31d23cbe3a3815205b5749e3e039902947753042820a7aade8f7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13f05e4_27de_4750_bb9d_008e3a0be0c7.slice\": RecentStats: unable to find data in memory cache]" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.060682 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.067360 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 
22:55:05.524382 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa8dda8-f620-4331-8909-b10784ceeab8" path="/var/lib/kubelet/pods/cfa8dda8-f620-4331-8909-b10784ceeab8/volumes" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.971606 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 22:55:05 crc kubenswrapper[4795]: E0219 22:55:05.972013 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" containerName="mariadb-database-create" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972030 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" containerName="mariadb-database-create" Feb 19 22:55:05 crc kubenswrapper[4795]: E0219 22:55:05.972049 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" containerName="mariadb-account-create-update" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972058 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" containerName="mariadb-account-create-update" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972329 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" containerName="mariadb-account-create-update" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972361 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" containerName="mariadb-database-create" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972964 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.976530 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2lvws" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.977877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.984339 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.087895 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.088749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.088919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.190667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.191872 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.192072 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.197265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.197351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.208668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") pod 
\"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.337316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.813659 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 22:55:07 crc kubenswrapper[4795]: I0219 22:55:07.461958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f5h94" event={"ID":"ace73a97-1b52-4187-a035-df7a08266bab","Type":"ContainerStarted","Data":"1ad516f68056dfaa3dffe45049adde7607d436761c153d38593aeeda7b4036af"} Feb 19 22:55:07 crc kubenswrapper[4795]: I0219 22:55:07.462317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f5h94" event={"ID":"ace73a97-1b52-4187-a035-df7a08266bab","Type":"ContainerStarted","Data":"5ac57d6447f66088480258c408e45417d8d396373de56641f2bcaa0b12a4827f"} Feb 19 22:55:07 crc kubenswrapper[4795]: I0219 22:55:07.492382 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-f5h94" podStartSLOduration=2.492358384 podStartE2EDuration="2.492358384s" podCreationTimestamp="2026-02-19 22:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:07.484603252 +0000 UTC m=+5218.677121126" watchObservedRunningTime="2026-02-19 22:55:07.492358384 +0000 UTC m=+5218.684876288" Feb 19 22:55:08 crc kubenswrapper[4795]: I0219 22:55:08.474481 4795 generic.go:334] "Generic (PLEG): container finished" podID="ace73a97-1b52-4187-a035-df7a08266bab" containerID="1ad516f68056dfaa3dffe45049adde7607d436761c153d38593aeeda7b4036af" exitCode=0 Feb 19 22:55:08 crc kubenswrapper[4795]: I0219 22:55:08.474571 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-db-sync-f5h94" event={"ID":"ace73a97-1b52-4187-a035-df7a08266bab","Type":"ContainerDied","Data":"1ad516f68056dfaa3dffe45049adde7607d436761c153d38593aeeda7b4036af"} Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.809610 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.860677 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") pod \"ace73a97-1b52-4187-a035-df7a08266bab\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.860750 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") pod \"ace73a97-1b52-4187-a035-df7a08266bab\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.860964 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") pod \"ace73a97-1b52-4187-a035-df7a08266bab\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.866915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw" (OuterVolumeSpecName: "kube-api-access-579zw") pod "ace73a97-1b52-4187-a035-df7a08266bab" (UID: "ace73a97-1b52-4187-a035-df7a08266bab"). InnerVolumeSpecName "kube-api-access-579zw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.867810 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ace73a97-1b52-4187-a035-df7a08266bab" (UID: "ace73a97-1b52-4187-a035-df7a08266bab"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.884915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ace73a97-1b52-4187-a035-df7a08266bab" (UID: "ace73a97-1b52-4187-a035-df7a08266bab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.962712 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.962752 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.962767 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.491002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f5h94" 
event={"ID":"ace73a97-1b52-4187-a035-df7a08266bab","Type":"ContainerDied","Data":"5ac57d6447f66088480258c408e45417d8d396373de56641f2bcaa0b12a4827f"} Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.491047 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ac57d6447f66088480258c408e45417d8d396373de56641f2bcaa0b12a4827f" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.491052 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.716652 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c6d9dfdbf-zg9wc"] Feb 19 22:55:10 crc kubenswrapper[4795]: E0219 22:55:10.717044 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace73a97-1b52-4187-a035-df7a08266bab" containerName="barbican-db-sync" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.717067 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace73a97-1b52-4187-a035-df7a08266bab" containerName="barbican-db-sync" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.717319 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace73a97-1b52-4187-a035-df7a08266bab" containerName="barbican-db-sync" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.718323 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.720853 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.721151 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.722296 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2lvws" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.764002 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.768472 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.772904 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.775966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6a3af4-fd31-411b-833c-5a39501f5d63-logs\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.776039 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 
22:55:10.776099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlcxk\" (UniqueName: \"kubernetes.io/projected/3e6a3af4-fd31-411b-833c-5a39501f5d63-kube-api-access-zlcxk\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.776133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data-custom\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.776187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-combined-ca-bundle\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.814881 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.822140 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c6d9dfdbf-zg9wc"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.853384 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.858348 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.878753 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlcxk\" (UniqueName: \"kubernetes.io/projected/3e6a3af4-fd31-411b-833c-5a39501f5d63-kube-api-access-zlcxk\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.878891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data-custom\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.878984 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data-custom\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-combined-ca-bundle\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879124 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh8xg\" (UniqueName: 
\"kubernetes.io/projected/5bb2f008-145f-4fc9-9d51-065874ab1b1e-kube-api-access-dh8xg\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6a3af4-fd31-411b-833c-5a39501f5d63-logs\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879441 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb2f008-145f-4fc9-9d51-065874ab1b1e-logs\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.880478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6a3af4-fd31-411b-833c-5a39501f5d63-logs\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.884075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data-custom\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.887676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.914116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-combined-ca-bundle\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " 
pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.918445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlcxk\" (UniqueName: \"kubernetes.io/projected/3e6a3af4-fd31-411b-833c-5a39501f5d63-kube-api-access-zlcxk\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.934274 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57b58f479d-8dz8t"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.935946 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.939077 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.956562 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57b58f479d-8dz8t"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.980913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh8xg\" (UniqueName: \"kubernetes.io/projected/5bb2f008-145f-4fc9-9d51-065874ab1b1e-kube-api-access-dh8xg\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 
22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981606 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " 
pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb2f008-145f-4fc9-9d51-065874ab1b1e-logs\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-combined-ca-bundle\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982375 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data-custom\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f7c03f-5289-48c5-987e-b808897adc6d-logs\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: 
\"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982610 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgxb6\" (UniqueName: \"kubernetes.io/projected/30f7c03f-5289-48c5-987e-b808897adc6d-kube-api-access-lgxb6\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data-custom\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.983315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.985807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb2f008-145f-4fc9-9d51-065874ab1b1e-logs\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.986526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-combined-ca-bundle\") pod 
\"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.987666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.995428 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data-custom\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.000896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh8xg\" (UniqueName: \"kubernetes.io/projected/5bb2f008-145f-4fc9-9d51-065874ab1b1e-kube-api-access-dh8xg\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.035296 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085509 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-combined-ca-bundle\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f7c03f-5289-48c5-987e-b808897adc6d-logs\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc 
kubenswrapper[4795]: I0219 22:55:11.085837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgxb6\" (UniqueName: \"kubernetes.io/projected/30f7c03f-5289-48c5-987e-b808897adc6d-kube-api-access-lgxb6\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data-custom\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.086038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc 
kubenswrapper[4795]: I0219 22:55:11.089151 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.089884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.090582 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f7c03f-5289-48c5-987e-b808897adc6d-logs\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.090863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.093988 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data-custom\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.094115 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.100896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.102742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-combined-ca-bundle\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.114849 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.115423 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.116405 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgxb6\" (UniqueName: \"kubernetes.io/projected/30f7c03f-5289-48c5-987e-b808897adc6d-kube-api-access-lgxb6\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.262784 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.350093 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.573038 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c6d9dfdbf-zg9wc"] Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.635369 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:55:11 crc kubenswrapper[4795]: W0219 22:55:11.661300 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f5b30f1_1278_4376_b1bc_6e72def4d494.slice/crio-0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d WatchSource:0}: Error finding container 0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d: Status 404 returned error can't find the container with id 0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.684714 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr"] Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.800101 4795 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/barbican-api-57b58f479d-8dz8t"] Feb 19 22:55:11 crc kubenswrapper[4795]: W0219 22:55:11.812789 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30f7c03f_5289_48c5_987e_b808897adc6d.slice/crio-b91e63e8606c45be88dc366dae5101d108ea95439a06796210749fc2fd80172f WatchSource:0}: Error finding container b91e63e8606c45be88dc366dae5101d108ea95439a06796210749fc2fd80172f: Status 404 returned error can't find the container with id b91e63e8606c45be88dc366dae5101d108ea95439a06796210749fc2fd80172f Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.533428 4795 generic.go:334] "Generic (PLEG): container finished" podID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerID="da78b2ce5ad3acba003da6948c0acb827331e556f914cb3178cd5862028563a8" exitCode=0 Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.533739 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerDied","Data":"da78b2ce5ad3acba003da6948c0acb827331e556f914cb3178cd5862028563a8"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.533766 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerStarted","Data":"0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.576361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" event={"ID":"3e6a3af4-fd31-411b-833c-5a39501f5d63","Type":"ContainerStarted","Data":"3f7b90f71568f9f1e2f76a73b0760f38eef669f889318b4bceecc96cfe93c6f4"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.576401 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" 
event={"ID":"3e6a3af4-fd31-411b-833c-5a39501f5d63","Type":"ContainerStarted","Data":"0e80fadd20836711ca4dabb497dae8bdccfbae769dcdfbca41342b3a99687b54"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.576410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" event={"ID":"3e6a3af4-fd31-411b-833c-5a39501f5d63","Type":"ContainerStarted","Data":"0d7dbcf09bbce07f0bc9745d4203787115390bb3d7e0d026295e9a5beffcce09"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.599371 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58f479d-8dz8t" event={"ID":"30f7c03f-5289-48c5-987e-b808897adc6d","Type":"ContainerStarted","Data":"ce4262fe2a9a2ead03e6081080511b9d9fac63d4f8a914df063115de337c815e"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.599413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58f479d-8dz8t" event={"ID":"30f7c03f-5289-48c5-987e-b808897adc6d","Type":"ContainerStarted","Data":"d23dea4efee1caf9050acb1dd78f0134a6b93ce46cf468a88179d34fc0242e76"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.599426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58f479d-8dz8t" event={"ID":"30f7c03f-5289-48c5-987e-b808897adc6d","Type":"ContainerStarted","Data":"b91e63e8606c45be88dc366dae5101d108ea95439a06796210749fc2fd80172f"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.600118 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.600140 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.605187 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" 
event={"ID":"5bb2f008-145f-4fc9-9d51-065874ab1b1e","Type":"ContainerStarted","Data":"395dacc5c3c5826c0ae5c0f3234a36b11f81cc6adc9ec428952c743cc6f7deff"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.605229 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" event={"ID":"5bb2f008-145f-4fc9-9d51-065874ab1b1e","Type":"ContainerStarted","Data":"36e0fc8410e757037d278e96ae40ef00c2855d84be871f41f02f19cb03dc8817"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.605241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" event={"ID":"5bb2f008-145f-4fc9-9d51-065874ab1b1e","Type":"ContainerStarted","Data":"14c6bafe81a7616551ef14573a02ba392e55faebb3797b3afbaf98c0be1d5e12"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.611517 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" podStartSLOduration=2.611499639 podStartE2EDuration="2.611499639s" podCreationTimestamp="2026-02-19 22:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:12.602437989 +0000 UTC m=+5223.794955853" watchObservedRunningTime="2026-02-19 22:55:12.611499639 +0000 UTC m=+5223.804017503" Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.635183 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57b58f479d-8dz8t" podStartSLOduration=2.635139977 podStartE2EDuration="2.635139977s" podCreationTimestamp="2026-02-19 22:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:12.617753358 +0000 UTC m=+5223.810271222" watchObservedRunningTime="2026-02-19 22:55:12.635139977 +0000 UTC m=+5223.827657861" Feb 19 22:55:12 crc kubenswrapper[4795]: 
I0219 22:55:12.647961 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" podStartSLOduration=2.647940284 podStartE2EDuration="2.647940284s" podCreationTimestamp="2026-02-19 22:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:12.639965165 +0000 UTC m=+5223.832483029" watchObservedRunningTime="2026-02-19 22:55:12.647940284 +0000 UTC m=+5223.840458148" Feb 19 22:55:13 crc kubenswrapper[4795]: I0219 22:55:13.614036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerStarted","Data":"be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9"} Feb 19 22:55:13 crc kubenswrapper[4795]: I0219 22:55:13.630971 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" podStartSLOduration=3.63095722 podStartE2EDuration="3.63095722s" podCreationTimestamp="2026-02-19 22:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:13.628285743 +0000 UTC m=+5224.820803607" watchObservedRunningTime="2026-02-19 22:55:13.63095722 +0000 UTC m=+5224.823475084" Feb 19 22:55:14 crc kubenswrapper[4795]: I0219 22:55:14.623356 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:15 crc kubenswrapper[4795]: I0219 22:55:15.265879 4795 scope.go:117] "RemoveContainer" containerID="595d4f05c4b7ec834570db4adf844a0fca41d1bed50151677fc612dd1b0457bc" Feb 19 22:55:15 crc kubenswrapper[4795]: I0219 22:55:15.511198 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:55:15 crc 
kubenswrapper[4795]: E0219 22:55:15.511576 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.265340 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.327644 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"] Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.329255 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="dnsmasq-dns" containerID="cri-o://1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701" gracePeriod=10 Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.702095 4795 generic.go:334] "Generic (PLEG): container finished" podID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerID="1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701" exitCode=0 Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.702138 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerDied","Data":"1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701"} Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.830750 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.989762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.990059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.990116 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.990235 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.990265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.999294 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp" (OuterVolumeSpecName: "kube-api-access-g49lp") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "kube-api-access-g49lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.034402 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.037846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config" (OuterVolumeSpecName: "config") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.052168 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.064751 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093560 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093604 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093619 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093632 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093644 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.713071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerDied","Data":"bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f"} Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.713117 4795 scope.go:117] "RemoveContainer" containerID="1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.713272 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.754849 4795 scope.go:117] "RemoveContainer" containerID="16b74941371be4fa54bab7807a1264eec35a2ff59fcbcca048e96bbfbf300be4" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.755853 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"] Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.779685 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"] Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.831984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.856677 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:23 crc kubenswrapper[4795]: I0219 22:55:23.528296 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" path="/var/lib/kubelet/pods/74bde2c2-542d-4473-8a2d-4276ef12f1a1/volumes" Feb 19 22:55:28 crc kubenswrapper[4795]: I0219 22:55:28.512018 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:55:28 crc kubenswrapper[4795]: E0219 22:55:28.512860 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.559383 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-db-create-xbwb6"] Feb 19 22:55:36 crc kubenswrapper[4795]: E0219 22:55:36.560247 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="dnsmasq-dns" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.560260 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="dnsmasq-dns" Feb 19 22:55:36 crc kubenswrapper[4795]: E0219 22:55:36.560270 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="init" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.560276 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="init" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.560430 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="dnsmasq-dns" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.561007 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.609008 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xbwb6"] Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.664512 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"] Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.666468 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.668773 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.675498 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"] Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.750188 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.750245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.852217 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.852328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") pod 
\"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.852367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.852389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.853056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.870786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.882045 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.953460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.953569 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.954462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.977508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.984598 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.412937 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xbwb6"] Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.566343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"] Feb 19 22:55:37 crc kubenswrapper[4795]: W0219 22:55:37.567812 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod635044d2_10e8_457c_b03e_9507a500c7fe.slice/crio-2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045 WatchSource:0}: Error finding container 2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045: Status 404 returned error can't find the container with id 2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045 Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.821981 4795 generic.go:334] "Generic (PLEG): container finished" podID="c0369c6f-517b-44b8-968a-a3408c6044d6" containerID="82b2542be6e058170f19505c42c4114714e88b52257f1c0e99664dda5eb05f78" exitCode=0 Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.822046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbwb6" event={"ID":"c0369c6f-517b-44b8-968a-a3408c6044d6","Type":"ContainerDied","Data":"82b2542be6e058170f19505c42c4114714e88b52257f1c0e99664dda5eb05f78"} Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.822072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbwb6" event={"ID":"c0369c6f-517b-44b8-968a-a3408c6044d6","Type":"ContainerStarted","Data":"798c223eb760c97067b30610ba4e1b170b3df8ec61f6c7acc66d63b4565f14de"} Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.823915 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-f69d-account-create-update-gbq6r" event={"ID":"635044d2-10e8-457c-b03e-9507a500c7fe","Type":"ContainerStarted","Data":"247acf34bea1d6df95accdf51f9a624b45c32153a0b9fa97bc69d2358a9601e8"} Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.823958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f69d-account-create-update-gbq6r" event={"ID":"635044d2-10e8-457c-b03e-9507a500c7fe","Type":"ContainerStarted","Data":"2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045"} Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.856836 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f69d-account-create-update-gbq6r" podStartSLOduration=1.856816596 podStartE2EDuration="1.856816596s" podCreationTimestamp="2026-02-19 22:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:37.848721924 +0000 UTC m=+5249.041239788" watchObservedRunningTime="2026-02-19 22:55:37.856816596 +0000 UTC m=+5249.049334460" Feb 19 22:55:38 crc kubenswrapper[4795]: I0219 22:55:38.836086 4795 generic.go:334] "Generic (PLEG): container finished" podID="635044d2-10e8-457c-b03e-9507a500c7fe" containerID="247acf34bea1d6df95accdf51f9a624b45c32153a0b9fa97bc69d2358a9601e8" exitCode=0 Feb 19 22:55:38 crc kubenswrapper[4795]: I0219 22:55:38.836133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f69d-account-create-update-gbq6r" event={"ID":"635044d2-10e8-457c-b03e-9507a500c7fe","Type":"ContainerDied","Data":"247acf34bea1d6df95accdf51f9a624b45c32153a0b9fa97bc69d2358a9601e8"} Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.195690 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.294432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") pod \"c0369c6f-517b-44b8-968a-a3408c6044d6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.294523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") pod \"c0369c6f-517b-44b8-968a-a3408c6044d6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.294949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0369c6f-517b-44b8-968a-a3408c6044d6" (UID: "c0369c6f-517b-44b8-968a-a3408c6044d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.299847 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd" (OuterVolumeSpecName: "kube-api-access-kl2rd") pod "c0369c6f-517b-44b8-968a-a3408c6044d6" (UID: "c0369c6f-517b-44b8-968a-a3408c6044d6"). InnerVolumeSpecName "kube-api-access-kl2rd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.396056 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.396087 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.519733 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:55:39 crc kubenswrapper[4795]: E0219 22:55:39.520104 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.846300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbwb6" event={"ID":"c0369c6f-517b-44b8-968a-a3408c6044d6","Type":"ContainerDied","Data":"798c223eb760c97067b30610ba4e1b170b3df8ec61f6c7acc66d63b4565f14de"} Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.846348 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="798c223eb760c97067b30610ba4e1b170b3df8ec61f6c7acc66d63b4565f14de" Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.846627 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xbwb6" Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.171413 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.309658 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") pod \"635044d2-10e8-457c-b03e-9507a500c7fe\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.309747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") pod \"635044d2-10e8-457c-b03e-9507a500c7fe\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.310501 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "635044d2-10e8-457c-b03e-9507a500c7fe" (UID: "635044d2-10e8-457c-b03e-9507a500c7fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.314137 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4" (OuterVolumeSpecName: "kube-api-access-4bks4") pod "635044d2-10e8-457c-b03e-9507a500c7fe" (UID: "635044d2-10e8-457c-b03e-9507a500c7fe"). InnerVolumeSpecName "kube-api-access-4bks4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.411665 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.411719 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.858343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f69d-account-create-update-gbq6r" event={"ID":"635044d2-10e8-457c-b03e-9507a500c7fe","Type":"ContainerDied","Data":"2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045"} Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.858385 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045" Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.858412 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f69d-account-create-update-gbq6r" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.926963 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-p5rjh"] Feb 19 22:55:41 crc kubenswrapper[4795]: E0219 22:55:41.927334 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635044d2-10e8-457c-b03e-9507a500c7fe" containerName="mariadb-account-create-update" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.927346 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="635044d2-10e8-457c-b03e-9507a500c7fe" containerName="mariadb-account-create-update" Feb 19 22:55:41 crc kubenswrapper[4795]: E0219 22:55:41.927375 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0369c6f-517b-44b8-968a-a3408c6044d6" containerName="mariadb-database-create" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.927381 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0369c6f-517b-44b8-968a-a3408c6044d6" containerName="mariadb-database-create" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.927514 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="635044d2-10e8-457c-b03e-9507a500c7fe" containerName="mariadb-account-create-update" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.927532 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0369c6f-517b-44b8-968a-a3408c6044d6" containerName="mariadb-database-create" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.928137 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.929939 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lsmm9" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.930174 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.930484 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.935102 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p5rjh"] Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.036664 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.036756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zhn4\" (UniqueName: \"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.036803 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.138312 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9zhn4\" (UniqueName: \"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.138401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.138463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.142442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.143391 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.157059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zhn4\" (UniqueName: 
\"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.244673 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.678291 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p5rjh"] Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.873249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5rjh" event={"ID":"9ad0e107-d857-4118-9582-5039b45f1ec8","Type":"ContainerStarted","Data":"a01a032519bdc879e7f051388c3f7e0f8289504340bef813d7d1864b844d8771"} Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.873539 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5rjh" event={"ID":"9ad0e107-d857-4118-9582-5039b45f1ec8","Type":"ContainerStarted","Data":"d3eea427f09096b213db3bad61bbfc80ea6959119d0cf31b074397e7c578d1fa"} Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.893776 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-p5rjh" podStartSLOduration=1.893755565 podStartE2EDuration="1.893755565s" podCreationTimestamp="2026-02-19 22:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:42.890124261 +0000 UTC m=+5254.082642125" watchObservedRunningTime="2026-02-19 22:55:42.893755565 +0000 UTC m=+5254.086273429" Feb 19 22:55:46 crc kubenswrapper[4795]: I0219 22:55:46.911214 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ad0e107-d857-4118-9582-5039b45f1ec8" containerID="a01a032519bdc879e7f051388c3f7e0f8289504340bef813d7d1864b844d8771" exitCode=0 Feb 19 22:55:46 crc 
kubenswrapper[4795]: I0219 22:55:46.911424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5rjh" event={"ID":"9ad0e107-d857-4118-9582-5039b45f1ec8","Type":"ContainerDied","Data":"a01a032519bdc879e7f051388c3f7e0f8289504340bef813d7d1864b844d8771"} Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.258877 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.341683 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") pod \"9ad0e107-d857-4118-9582-5039b45f1ec8\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.341792 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zhn4\" (UniqueName: \"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") pod \"9ad0e107-d857-4118-9582-5039b45f1ec8\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.342710 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") pod \"9ad0e107-d857-4118-9582-5039b45f1ec8\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.348764 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4" (OuterVolumeSpecName: "kube-api-access-9zhn4") pod "9ad0e107-d857-4118-9582-5039b45f1ec8" (UID: "9ad0e107-d857-4118-9582-5039b45f1ec8"). InnerVolumeSpecName "kube-api-access-9zhn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.366981 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ad0e107-d857-4118-9582-5039b45f1ec8" (UID: "9ad0e107-d857-4118-9582-5039b45f1ec8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.384740 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config" (OuterVolumeSpecName: "config") pod "9ad0e107-d857-4118-9582-5039b45f1ec8" (UID: "9ad0e107-d857-4118-9582-5039b45f1ec8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.445024 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.445058 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zhn4\" (UniqueName: \"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.445072 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.929470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5rjh" 
event={"ID":"9ad0e107-d857-4118-9582-5039b45f1ec8","Type":"ContainerDied","Data":"d3eea427f09096b213db3bad61bbfc80ea6959119d0cf31b074397e7c578d1fa"} Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.929511 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3eea427f09096b213db3bad61bbfc80ea6959119d0cf31b074397e7c578d1fa" Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.929887 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5rjh" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.180455 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"] Feb 19 22:55:49 crc kubenswrapper[4795]: E0219 22:55:49.180795 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad0e107-d857-4118-9582-5039b45f1ec8" containerName="neutron-db-sync" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.180816 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad0e107-d857-4118-9582-5039b45f1ec8" containerName="neutron-db-sync" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.180995 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ad0e107-d857-4118-9582-5039b45f1ec8" containerName="neutron-db-sync" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.181899 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.204686 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"] Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.272908 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7945766d5c-fjptf"] Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.274358 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.278586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.278906 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279106 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lsmm9" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279819 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl457\" (UniqueName: 
\"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.285755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7945766d5c-fjptf"] Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382015 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmx67\" (UniqueName: \"kubernetes.io/projected/dea0417f-0988-4d82-80cc-03298be367bd-kube-api-access-xmx67\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382107 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl457\" (UniqueName: \"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 
19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382237 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-combined-ca-bundle\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-httpd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382430 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.383027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.383358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.383397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.383543 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.400397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl457\" (UniqueName: 
\"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.483702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.484086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmx67\" (UniqueName: \"kubernetes.io/projected/dea0417f-0988-4d82-80cc-03298be367bd-kube-api-access-xmx67\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.484186 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-combined-ca-bundle\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.484217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-httpd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.487488 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-httpd-config\") pod 
\"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.487836 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.488768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-combined-ca-bundle\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.500212 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.506706 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmx67\" (UniqueName: \"kubernetes.io/projected/dea0417f-0988-4d82-80cc-03298be367bd-kube-api-access-xmx67\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.603215 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.054192 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"] Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.364976 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7945766d5c-fjptf"] Feb 19 22:55:50 crc kubenswrapper[4795]: W0219 22:55:50.379702 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddea0417f_0988_4d82_80cc_03298be367bd.slice/crio-ff1abb28fae1a83463f38eacf392a3526e7e57c91627771bd880aebeac689880 WatchSource:0}: Error finding container ff1abb28fae1a83463f38eacf392a3526e7e57c91627771bd880aebeac689880: Status 404 returned error can't find the container with id ff1abb28fae1a83463f38eacf392a3526e7e57c91627771bd880aebeac689880 Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.943763 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerID="400d407c08a50e5c293b398a82df6a8f79475760e406fca1855e5558b33cb384" exitCode=0 Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.943872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerDied","Data":"400d407c08a50e5c293b398a82df6a8f79475760e406fca1855e5558b33cb384"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.944199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerStarted","Data":"7217af4cd4eb00396c1a57059f86739ac95943d868ddbc9e0af3ab209b2339ee"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.946713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7945766d5c-fjptf" 
event={"ID":"dea0417f-0988-4d82-80cc-03298be367bd","Type":"ContainerStarted","Data":"78fe4dbc7e56ab8394f76ce36d07a23856dbfc8149134c7fe2e37c98f537d806"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.946750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7945766d5c-fjptf" event={"ID":"dea0417f-0988-4d82-80cc-03298be367bd","Type":"ContainerStarted","Data":"7aab37ce10ab6572e88c1a662fda02629c82d89034781cfe1e5b1ae472dfa9cf"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.946765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7945766d5c-fjptf" event={"ID":"dea0417f-0988-4d82-80cc-03298be367bd","Type":"ContainerStarted","Data":"ff1abb28fae1a83463f38eacf392a3526e7e57c91627771bd880aebeac689880"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.946867 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.996640 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7945766d5c-fjptf" podStartSLOduration=1.996619224 podStartE2EDuration="1.996619224s" podCreationTimestamp="2026-02-19 22:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:50.992531737 +0000 UTC m=+5262.185049601" watchObservedRunningTime="2026-02-19 22:55:50.996619224 +0000 UTC m=+5262.189137088" Feb 19 22:55:51 crc kubenswrapper[4795]: I0219 22:55:51.511278 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:55:51 crc kubenswrapper[4795]: E0219 22:55:51.511571 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:51 crc kubenswrapper[4795]: I0219 22:55:51.956704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerStarted","Data":"db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615"} Feb 19 22:55:51 crc kubenswrapper[4795]: I0219 22:55:51.957037 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:51 crc kubenswrapper[4795]: I0219 22:55:51.983534 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" podStartSLOduration=2.983514323 podStartE2EDuration="2.983514323s" podCreationTimestamp="2026-02-19 22:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:51.974433252 +0000 UTC m=+5263.166951136" watchObservedRunningTime="2026-02-19 22:55:51.983514323 +0000 UTC m=+5263.176032187" Feb 19 22:55:59 crc kubenswrapper[4795]: I0219 22:55:59.502350 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:59 crc kubenswrapper[4795]: I0219 22:55:59.558523 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:55:59 crc kubenswrapper[4795]: I0219 22:55:59.558765 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="dnsmasq-dns" containerID="cri-o://be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9" gracePeriod=10 Feb 19 
22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.023090 4795 generic.go:334] "Generic (PLEG): container finished" podID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerID="be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9" exitCode=0 Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.023129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerDied","Data":"be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9"} Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.023457 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerDied","Data":"0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d"} Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.023474 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.080096 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265593 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265643 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265740 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.273693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp" (OuterVolumeSpecName: "kube-api-access-bwwxp") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "kube-api-access-bwwxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.317546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config" (OuterVolumeSpecName: "config") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.332132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.341271 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.355872 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367629 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367665 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367676 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367684 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367692 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:01 crc kubenswrapper[4795]: I0219 22:56:01.030582 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:56:01 crc kubenswrapper[4795]: I0219 22:56:01.065496 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:56:01 crc kubenswrapper[4795]: I0219 22:56:01.071192 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:56:01 crc kubenswrapper[4795]: I0219 22:56:01.521527 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" path="/var/lib/kubelet/pods/0f5b30f1-1278-4376-b1bc-6e72def4d494/volumes" Feb 19 22:56:06 crc kubenswrapper[4795]: I0219 22:56:06.512549 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:56:07 crc kubenswrapper[4795]: I0219 22:56:07.081804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16"} Feb 19 22:56:19 crc kubenswrapper[4795]: I0219 22:56:19.616016 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.912231 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 22:56:26 crc kubenswrapper[4795]: E0219 22:56:26.913069 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="dnsmasq-dns" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.913086 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="dnsmasq-dns" Feb 19 22:56:26 crc kubenswrapper[4795]: E0219 22:56:26.913110 4795 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="init" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.913119 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="init" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.913359 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="dnsmasq-dns" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.914040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-57skz" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.922375 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.015555 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.016526 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.019031 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.026342 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.069411 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.069481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.171371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.171684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") pod \"glance-db-create-57skz\" (UID: 
\"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.171729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.171778 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.172445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.197062 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.273708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") pod \"glance-c270-account-create-update-m9p4w\" (UID: 
\"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.273818 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.274893 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.280281 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.293766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.331415 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.730013 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.817455 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.261645 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" containerID="ce12984ea586896da4a3a2ca9a12c46a23d3b89ee0886c7e9ee2b6ceb73add38" exitCode=0 Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.261723 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c270-account-create-update-m9p4w" event={"ID":"e6d0c29a-694d-4afc-ba36-c66fa8fd0328","Type":"ContainerDied","Data":"ce12984ea586896da4a3a2ca9a12c46a23d3b89ee0886c7e9ee2b6ceb73add38"} Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.261752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c270-account-create-update-m9p4w" event={"ID":"e6d0c29a-694d-4afc-ba36-c66fa8fd0328","Type":"ContainerStarted","Data":"e113f538a1a32a9c0a8de19728d882d6568b7f3bce7fabcc99c168487e7235c9"} Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.271658 4795 generic.go:334] "Generic (PLEG): container finished" podID="b5efc0b6-7441-4f4b-827e-d920c711d076" containerID="406b01e5656ffc8e932bd5ff0af64ce799789e55eeb816cb84ffd89f44da61ef" exitCode=0 Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.271693 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-57skz" event={"ID":"b5efc0b6-7441-4f4b-827e-d920c711d076","Type":"ContainerDied","Data":"406b01e5656ffc8e932bd5ff0af64ce799789e55eeb816cb84ffd89f44da61ef"} Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.271716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-57skz" event={"ID":"b5efc0b6-7441-4f4b-827e-d920c711d076","Type":"ContainerStarted","Data":"6b2c41ca69ed4deb69d1b1332d8a3fd6cd1ed218f442a9619c93b3ed3970049a"} Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.646621 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.651522 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-57skz" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.819327 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") pod \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.819454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") pod \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.819474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") pod \"b5efc0b6-7441-4f4b-827e-d920c711d076\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.819543 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") pod \"b5efc0b6-7441-4f4b-827e-d920c711d076\" (UID: 
\"b5efc0b6-7441-4f4b-827e-d920c711d076\") " Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.820070 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5efc0b6-7441-4f4b-827e-d920c711d076" (UID: "b5efc0b6-7441-4f4b-827e-d920c711d076"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.820092 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6d0c29a-694d-4afc-ba36-c66fa8fd0328" (UID: "e6d0c29a-694d-4afc-ba36-c66fa8fd0328"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.824837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c" (OuterVolumeSpecName: "kube-api-access-qjr5c") pod "b5efc0b6-7441-4f4b-827e-d920c711d076" (UID: "b5efc0b6-7441-4f4b-827e-d920c711d076"). InnerVolumeSpecName "kube-api-access-qjr5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.824929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24" (OuterVolumeSpecName: "kube-api-access-njq24") pod "e6d0c29a-694d-4afc-ba36-c66fa8fd0328" (UID: "e6d0c29a-694d-4afc-ba36-c66fa8fd0328"). InnerVolumeSpecName "kube-api-access-njq24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.922807 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.922845 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.922989 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.923001 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.289897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c270-account-create-update-m9p4w" event={"ID":"e6d0c29a-694d-4afc-ba36-c66fa8fd0328","Type":"ContainerDied","Data":"e113f538a1a32a9c0a8de19728d882d6568b7f3bce7fabcc99c168487e7235c9"} Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.290251 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e113f538a1a32a9c0a8de19728d882d6568b7f3bce7fabcc99c168487e7235c9" Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.289956 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.292119 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-57skz" event={"ID":"b5efc0b6-7441-4f4b-827e-d920c711d076","Type":"ContainerDied","Data":"6b2c41ca69ed4deb69d1b1332d8a3fd6cd1ed218f442a9619c93b3ed3970049a"} Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.292211 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b2c41ca69ed4deb69d1b1332d8a3fd6cd1ed218f442a9619c93b3ed3970049a" Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.292245 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-57skz" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.248586 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wrz6p"] Feb 19 22:56:32 crc kubenswrapper[4795]: E0219 22:56:32.248992 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5efc0b6-7441-4f4b-827e-d920c711d076" containerName="mariadb-database-create" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249007 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5efc0b6-7441-4f4b-827e-d920c711d076" containerName="mariadb-database-create" Feb 19 22:56:32 crc kubenswrapper[4795]: E0219 22:56:32.249018 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" containerName="mariadb-account-create-update" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249027 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" containerName="mariadb-account-create-update" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249278 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" containerName="mariadb-account-create-update" 
Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249304 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5efc0b6-7441-4f4b-827e-d920c711d076" containerName="mariadb-database-create" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249931 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.251947 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.252103 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8gwsm" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.260401 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wrz6p"] Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.363892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.363947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.363972 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") pod \"glance-db-sync-wrz6p\" (UID: 
\"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.364285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.465885 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.465960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.465992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.466012 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 
19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.471454 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.471630 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.480340 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.481388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.579902 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:33 crc kubenswrapper[4795]: I0219 22:56:33.106842 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wrz6p"] Feb 19 22:56:33 crc kubenswrapper[4795]: I0219 22:56:33.316147 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrz6p" event={"ID":"3e95033f-725f-4784-995c-ec7a3b9c24c4","Type":"ContainerStarted","Data":"a0569c78b70090d7fd22b175f603c4cf568abb23fa79c1c5271376d6515bd29a"} Feb 19 22:56:34 crc kubenswrapper[4795]: I0219 22:56:34.334069 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrz6p" event={"ID":"3e95033f-725f-4784-995c-ec7a3b9c24c4","Type":"ContainerStarted","Data":"b7a9bfd4cc9f1de56acd0e84791ea820651eb87fe9629808f92a9d7ea31fba08"} Feb 19 22:56:34 crc kubenswrapper[4795]: I0219 22:56:34.370614 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wrz6p" podStartSLOduration=2.37059393 podStartE2EDuration="2.37059393s" podCreationTimestamp="2026-02-19 22:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:34.367917642 +0000 UTC m=+5305.560435536" watchObservedRunningTime="2026-02-19 22:56:34.37059393 +0000 UTC m=+5305.563111794" Feb 19 22:56:37 crc kubenswrapper[4795]: I0219 22:56:37.356534 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e95033f-725f-4784-995c-ec7a3b9c24c4" containerID="b7a9bfd4cc9f1de56acd0e84791ea820651eb87fe9629808f92a9d7ea31fba08" exitCode=0 Feb 19 22:56:37 crc kubenswrapper[4795]: I0219 22:56:37.356636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrz6p" event={"ID":"3e95033f-725f-4784-995c-ec7a3b9c24c4","Type":"ContainerDied","Data":"b7a9bfd4cc9f1de56acd0e84791ea820651eb87fe9629808f92a9d7ea31fba08"} Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 
22:56:38.796287 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.902652 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") pod \"3e95033f-725f-4784-995c-ec7a3b9c24c4\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.902705 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") pod \"3e95033f-725f-4784-995c-ec7a3b9c24c4\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.902794 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") pod \"3e95033f-725f-4784-995c-ec7a3b9c24c4\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.902928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") pod \"3e95033f-725f-4784-995c-ec7a3b9c24c4\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.907276 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7" (OuterVolumeSpecName: "kube-api-access-bxkb7") pod "3e95033f-725f-4784-995c-ec7a3b9c24c4" (UID: "3e95033f-725f-4784-995c-ec7a3b9c24c4"). InnerVolumeSpecName "kube-api-access-bxkb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.916262 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3e95033f-725f-4784-995c-ec7a3b9c24c4" (UID: "3e95033f-725f-4784-995c-ec7a3b9c24c4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.925416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e95033f-725f-4784-995c-ec7a3b9c24c4" (UID: "3e95033f-725f-4784-995c-ec7a3b9c24c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.949223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data" (OuterVolumeSpecName: "config-data") pod "3e95033f-725f-4784-995c-ec7a3b9c24c4" (UID: "3e95033f-725f-4784-995c-ec7a3b9c24c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.004908 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.004942 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.004956 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.005008 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.380210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrz6p" event={"ID":"3e95033f-725f-4784-995c-ec7a3b9c24c4","Type":"ContainerDied","Data":"a0569c78b70090d7fd22b175f603c4cf568abb23fa79c1c5271376d6515bd29a"} Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.380312 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0569c78b70090d7fd22b175f603c4cf568abb23fa79c1c5271376d6515bd29a" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.380232 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.675465 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:39 crc kubenswrapper[4795]: E0219 22:56:39.676255 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e95033f-725f-4784-995c-ec7a3b9c24c4" containerName="glance-db-sync" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.676279 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e95033f-725f-4784-995c-ec7a3b9c24c4" containerName="glance-db-sync" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.676504 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e95033f-725f-4784-995c-ec7a3b9c24c4" containerName="glance-db-sync" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.677511 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.688892 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.689304 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.689337 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8gwsm" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.689543 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.710903 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.765903 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.767210 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.778703 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819280 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819384 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819406 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc 
kubenswrapper[4795]: I0219 22:56:39.819424 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.862852 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.864714 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.869366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.874179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920658 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc 
kubenswrapper[4795]: I0219 22:56:39.920727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920758 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920800 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920841 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 
22:56:39.920898 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920940 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.921588 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.924845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.925019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.927064 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.932894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.935885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.943522 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.997935 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022244 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") pod 
\"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022385 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022407 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022428 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: 
\"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022551 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.023464 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.023647 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.024216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 
19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.025282 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.040987 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.085494 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.124252 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.124542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.124979 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.128163 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.128722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.128754 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.128865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.129254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.129564 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.133794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.137156 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.142373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.144320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 
22:56:40.146330 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.306058 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.658692 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.691109 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.926929 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.986642 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:40 crc kubenswrapper[4795]: W0219 22:56:40.998104 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod388d272a_cffa_4321_ac91_648accbf6930.slice/crio-1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef WatchSource:0}: Error finding container 1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef: Status 404 returned error can't find the container with id 1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.405785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerStarted","Data":"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77"} Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.406279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerStarted","Data":"cd0b6e7125b0cc4a617c794275fd47b2d7fa67576e3ce216320de639f679c6fb"} Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.415312 4795 generic.go:334] "Generic (PLEG): container finished" podID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerID="23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261" exitCode=0 Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.415418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerDied","Data":"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261"} Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.415467 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerStarted","Data":"7922c7a8658b3cbc38083041c7e26029e3e327c9354e7a22ada01242a334b1d4"} Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.418518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerStarted","Data":"1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef"} Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.428352 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerStarted","Data":"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917"} Feb 19 22:56:42 
crc kubenswrapper[4795]: I0219 22:56:42.429010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.430250 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerStarted","Data":"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35"} Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.430282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerStarted","Data":"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f"} Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.431944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerStarted","Data":"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14"} Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.432276 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-httpd" containerID="cri-o://c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" gracePeriod=30 Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.432280 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-log" containerID="cri-o://d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" gracePeriod=30 Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.454979 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" podStartSLOduration=3.454960967 podStartE2EDuration="3.454960967s" podCreationTimestamp="2026-02-19 22:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:42.448643386 +0000 UTC m=+5313.641161280" watchObservedRunningTime="2026-02-19 22:56:42.454960967 +0000 UTC m=+5313.647478831" Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.477687 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.477666715 podStartE2EDuration="3.477666715s" podCreationTimestamp="2026-02-19 22:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:42.475180612 +0000 UTC m=+5313.667698476" watchObservedRunningTime="2026-02-19 22:56:42.477666715 +0000 UTC m=+5313.670184579" Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.500059 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.500034894 podStartE2EDuration="3.500034894s" podCreationTimestamp="2026-02-19 22:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:42.493267852 +0000 UTC m=+5313.685785776" watchObservedRunningTime="2026-02-19 22:56:42.500034894 +0000 UTC m=+5313.692552758" Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.713109 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.046790 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196680 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196730 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196750 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196843 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196891 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.197333 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.197510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs" (OuterVolumeSpecName: "logs") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.202868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx" (OuterVolumeSpecName: "kube-api-access-8pdqx") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "kube-api-access-8pdqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.216763 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph" (OuterVolumeSpecName: "ceph") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.216888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts" (OuterVolumeSpecName: "scripts") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.227297 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.251359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data" (OuterVolumeSpecName: "config-data") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299187 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299222 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299233 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299240 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299250 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299260 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299269 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441449 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" exitCode=0 Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441491 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441520 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerDied","Data":"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14"} Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441565 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerDied","Data":"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77"} Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441496 4795 generic.go:334] "Generic (PLEG): container finished" podID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" exitCode=143 Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441587 4795 scope.go:117] "RemoveContainer" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441684 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerDied","Data":"cd0b6e7125b0cc4a617c794275fd47b2d7fa67576e3ce216320de639f679c6fb"} Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.463781 4795 scope.go:117] "RemoveContainer" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.477140 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.495291 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.526355 4795 scope.go:117] "RemoveContainer" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" Feb 19 22:56:43 crc kubenswrapper[4795]: E0219 22:56:43.527422 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": container with ID starting with c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14 not found: ID does not exist" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.527477 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14"} err="failed to get container status \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": rpc error: code = NotFound desc = could not find container \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": container with ID starting with c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14 not found: ID does not exist" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.527506 4795 scope.go:117] "RemoveContainer" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" Feb 19 22:56:43 crc kubenswrapper[4795]: E0219 22:56:43.528134 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": container with ID starting with d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77 not found: ID 
does not exist" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.528228 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77"} err="failed to get container status \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": rpc error: code = NotFound desc = could not find container \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": container with ID starting with d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77 not found: ID does not exist" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.528264 4795 scope.go:117] "RemoveContainer" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.528626 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14"} err="failed to get container status \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": rpc error: code = NotFound desc = could not find container \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": container with ID starting with c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14 not found: ID does not exist" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.528648 4795 scope.go:117] "RemoveContainer" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.529045 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77"} err="failed to get container status \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": rpc error: code = 
NotFound desc = could not find container \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": container with ID starting with d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77 not found: ID does not exist" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.531902 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" path="/var/lib/kubelet/pods/13c7f8fe-962d-47d0-9607-f121e0c6a38d/volumes" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.532775 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:43 crc kubenswrapper[4795]: E0219 22:56:43.533079 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-httpd" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.533100 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-httpd" Feb 19 22:56:43 crc kubenswrapper[4795]: E0219 22:56:43.533119 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-log" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.533127 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-log" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.533378 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-log" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.533404 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-httpd" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.534884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 
22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.534996 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.538755 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706546 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706661 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706864 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.707028 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.707057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808540 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808591 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.809832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.809856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.812871 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.813355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.813752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.816298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.829030 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.859953 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.348080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:44 crc kubenswrapper[4795]: W0219 22:56:44.349328 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ba19509_98fd_4ae4_b9ab_673c27ab8e85.slice/crio-80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac WatchSource:0}: Error finding container 80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac: Status 404 returned error can't find the container with id 80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.450895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerStarted","Data":"80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac"} Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.451067 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-log" 
containerID="cri-o://b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" gracePeriod=30 Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.451204 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-httpd" containerID="cri-o://20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" gracePeriod=30 Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.924872 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.028761 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.028845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.028911 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.028998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") pod 
\"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.029037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.029061 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.029085 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.030011 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs" (OuterVolumeSpecName: "logs") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.030320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.034860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts" (OuterVolumeSpecName: "scripts") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.034954 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph" (OuterVolumeSpecName: "ceph") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.035593 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh" (OuterVolumeSpecName: "kube-api-access-kdwdh") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "kube-api-access-kdwdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.057900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.091672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data" (OuterVolumeSpecName: "config-data") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130888 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130925 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130934 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130943 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130952 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130959 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130966 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.463777 4795 generic.go:334] "Generic (PLEG): container finished" podID="388d272a-cffa-4321-ac91-648accbf6930" containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" exitCode=0 Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464151 4795 generic.go:334] "Generic (PLEG): container finished" podID="388d272a-cffa-4321-ac91-648accbf6930" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" exitCode=143 Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464197 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464194 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerDied","Data":"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35"} Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerDied","Data":"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f"} Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerDied","Data":"1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef"} 
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464324 4795 scope.go:117] "RemoveContainer" containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.467924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerStarted","Data":"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648"} Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.467963 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerStarted","Data":"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58"} Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.491567 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.491542103 podStartE2EDuration="2.491542103s" podCreationTimestamp="2026-02-19 22:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:45.486096484 +0000 UTC m=+5316.678614368" watchObservedRunningTime="2026-02-19 22:56:45.491542103 +0000 UTC m=+5316.684059967" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.503063 4795 scope.go:117] "RemoveContainer" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.509672 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.524860 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.532862 4795 scope.go:117] "RemoveContainer" 
containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" Feb 19 22:56:45 crc kubenswrapper[4795]: E0219 22:56:45.533389 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": container with ID starting with 20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35 not found: ID does not exist" containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.533419 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35"} err="failed to get container status \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": rpc error: code = NotFound desc = could not find container \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": container with ID starting with 20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35 not found: ID does not exist" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.533438 4795 scope.go:117] "RemoveContainer" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" Feb 19 22:56:45 crc kubenswrapper[4795]: E0219 22:56:45.533937 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": container with ID starting with b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f not found: ID does not exist" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.533963 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f"} err="failed to get container status \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": rpc error: code = NotFound desc = could not find container \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": container with ID starting with b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f not found: ID does not exist" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.533979 4795 scope.go:117] "RemoveContainer" containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.534256 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35"} err="failed to get container status \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": rpc error: code = NotFound desc = could not find container \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": container with ID starting with 20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35 not found: ID does not exist" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.534282 4795 scope.go:117] "RemoveContainer" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.534672 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f"} err="failed to get container status \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": rpc error: code = NotFound desc = could not find container \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": container with ID starting with b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f not found: ID does not 
exist" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.544089 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:45 crc kubenswrapper[4795]: E0219 22:56:45.545577 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-log" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.545612 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-log" Feb 19 22:56:45 crc kubenswrapper[4795]: E0219 22:56:45.545628 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-httpd" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.545635 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-httpd" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.545789 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-log" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.545806 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-httpd" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.548907 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.553620 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.554311 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640251 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640341 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640516 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640630 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640772 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742522 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742547 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742577 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742594 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.744551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.744943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.746914 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.747396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.747494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.760865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.770137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.863771 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:46 crc kubenswrapper[4795]: I0219 22:56:46.409369 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 22:56:46 crc kubenswrapper[4795]: I0219 22:56:46.482056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerStarted","Data":"5b24832724357e9fe3cd524ab47d5c8ea237d7ac8d26ead3ec336002073cfe70"}
Feb 19 22:56:47 crc kubenswrapper[4795]: I0219 22:56:47.490103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerStarted","Data":"1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b"}
Feb 19 22:56:47 crc kubenswrapper[4795]: I0219 22:56:47.490647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerStarted","Data":"2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511"}
Feb 19 22:56:47 crc kubenswrapper[4795]: I0219 22:56:47.517285 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.517263849 podStartE2EDuration="2.517263849s" podCreationTimestamp="2026-02-19 22:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:47.50472692 +0000 UTC m=+5318.697244794" watchObservedRunningTime="2026-02-19 22:56:47.517263849 +0000 UTC m=+5318.709781723"
Feb 19 22:56:47 crc kubenswrapper[4795]: I0219 22:56:47.527204 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388d272a-cffa-4321-ac91-648accbf6930" path="/var/lib/kubelet/pods/388d272a-cffa-4321-ac91-648accbf6930/volumes"
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.087729 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj"
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.203576 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"]
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.203939 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="dnsmasq-dns" containerID="cri-o://db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615" gracePeriod=10
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.518658 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerID="db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615" exitCode=0
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.518982 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerDied","Data":"db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615"}
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.700593 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841091 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841281 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841457 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl457\" (UniqueName: \"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.864563 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457" (OuterVolumeSpecName: "kube-api-access-cl457") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "kube-api-access-cl457". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.884880 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.885613 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.904212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.910709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config" (OuterVolumeSpecName: "config") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943666 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943702 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943712 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943720 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943731 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl457\" (UniqueName: \"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.533551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerDied","Data":"7217af4cd4eb00396c1a57059f86739ac95943d868ddbc9e0af3ab209b2339ee"}
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.533615 4795 scope.go:117] "RemoveContainer" containerID="db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615"
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.533774 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.563805 4795 scope.go:117] "RemoveContainer" containerID="400d407c08a50e5c293b398a82df6a8f79475760e406fca1855e5558b33cb384"
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.586194 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"]
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.595185 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"]
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.527418 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" path="/var/lib/kubelet/pods/8e5a1fbd-4617-434e-8719-12b16bc88b98/volumes"
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.860819 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.861244 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.898759 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.907198 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:54 crc kubenswrapper[4795]: I0219 22:56:54.567657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:54 crc kubenswrapper[4795]: I0219 22:56:54.567701 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:55 crc kubenswrapper[4795]: I0219 22:56:55.864920 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:55 crc kubenswrapper[4795]: I0219 22:56:55.864978 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:55 crc kubenswrapper[4795]: I0219 22:56:55.890968 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:55 crc kubenswrapper[4795]: I0219 22:56:55.900216 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.584257 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.584670 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.607607 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.607745 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.611544 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:58 crc kubenswrapper[4795]: I0219 22:56:58.581708 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:58 crc kubenswrapper[4795]: I0219 22:56:58.588704 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.378327 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-w9m97"]
Feb 19 22:57:06 crc kubenswrapper[4795]: E0219 22:57:06.379437 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="init"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.379457 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="init"
Feb 19 22:57:06 crc kubenswrapper[4795]: E0219 22:57:06.379478 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="dnsmasq-dns"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.379487 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="dnsmasq-dns"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.379703 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="dnsmasq-dns"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.380493 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.408026 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"]
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.409262 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.411734 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.432041 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w9m97"]
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.438883 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"]
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.468379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.468537 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.468565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.468584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.570211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.570261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.570282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.570432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.573080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.574410 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.596013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.599023 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.701370 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.728836 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.143033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w9m97"]
Feb 19 22:57:07 crc kubenswrapper[4795]: W0219 22:57:07.147620 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcc96fc8_80e4_4dda_af2e_91390b6af829.slice/crio-acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860 WatchSource:0}: Error finding container acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860: Status 404 returned error can't find the container with id acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.223635 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"]
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.678612 4795 generic.go:334] "Generic (PLEG): container finished" podID="fcc96fc8-80e4-4dda-af2e-91390b6af829" containerID="a92e1a832062a3ff2c20dc8cd15ae2f139e651b81902a3e2ee439ec6e20b2123" exitCode=0
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.678663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w9m97" event={"ID":"fcc96fc8-80e4-4dda-af2e-91390b6af829","Type":"ContainerDied","Data":"a92e1a832062a3ff2c20dc8cd15ae2f139e651b81902a3e2ee439ec6e20b2123"}
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.678981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w9m97" event={"ID":"fcc96fc8-80e4-4dda-af2e-91390b6af829","Type":"ContainerStarted","Data":"acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860"}
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.681213 4795 generic.go:334] "Generic (PLEG): container finished" podID="eda8a248-0107-4d34-a02b-6dbf30972c64" containerID="dbe34d14d5acf3b90d2e3b7465f76c19cc181a6a6717aade5162b1682beb16f4" exitCode=0
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.681240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4561-account-create-update-zf5q8" event={"ID":"eda8a248-0107-4d34-a02b-6dbf30972c64","Type":"ContainerDied","Data":"dbe34d14d5acf3b90d2e3b7465f76c19cc181a6a6717aade5162b1682beb16f4"}
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.681253 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4561-account-create-update-zf5q8" event={"ID":"eda8a248-0107-4d34-a02b-6dbf30972c64","Type":"ContainerStarted","Data":"8ca94684f9202aef28c8b84cf7921d5e25e8941ae1b0a05199c45c26a6f9e78a"}
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.133221 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.143710 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.317541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") pod \"fcc96fc8-80e4-4dda-af2e-91390b6af829\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") "
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.317688 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") pod \"fcc96fc8-80e4-4dda-af2e-91390b6af829\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") "
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.317713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") pod \"eda8a248-0107-4d34-a02b-6dbf30972c64\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") "
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.317771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") pod \"eda8a248-0107-4d34-a02b-6dbf30972c64\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") "
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.318258 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fcc96fc8-80e4-4dda-af2e-91390b6af829" (UID: "fcc96fc8-80e4-4dda-af2e-91390b6af829"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.318505 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eda8a248-0107-4d34-a02b-6dbf30972c64" (UID: "eda8a248-0107-4d34-a02b-6dbf30972c64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.323400 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz" (OuterVolumeSpecName: "kube-api-access-csjxz") pod "fcc96fc8-80e4-4dda-af2e-91390b6af829" (UID: "fcc96fc8-80e4-4dda-af2e-91390b6af829"). InnerVolumeSpecName "kube-api-access-csjxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.326336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798" (OuterVolumeSpecName: "kube-api-access-lm798") pod "eda8a248-0107-4d34-a02b-6dbf30972c64" (UID: "eda8a248-0107-4d34-a02b-6dbf30972c64"). InnerVolumeSpecName "kube-api-access-lm798". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.420140 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.420573 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") on node \"crc\" DevicePath \"\""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.420591 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.420602 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") on node \"crc\" DevicePath \"\""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.700302 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4561-account-create-update-zf5q8" event={"ID":"eda8a248-0107-4d34-a02b-6dbf30972c64","Type":"ContainerDied","Data":"8ca94684f9202aef28c8b84cf7921d5e25e8941ae1b0a05199c45c26a6f9e78a"}
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.700352 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ca94684f9202aef28c8b84cf7921d5e25e8941ae1b0a05199c45c26a6f9e78a"
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.700434 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4561-account-create-update-zf5q8" Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.702683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w9m97" event={"ID":"fcc96fc8-80e4-4dda-af2e-91390b6af829","Type":"ContainerDied","Data":"acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860"} Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.702713 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860" Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.702768 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w9m97" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.708079 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"] Feb 19 22:57:11 crc kubenswrapper[4795]: E0219 22:57:11.709815 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda8a248-0107-4d34-a02b-6dbf30972c64" containerName="mariadb-account-create-update" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.709913 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda8a248-0107-4d34-a02b-6dbf30972c64" containerName="mariadb-account-create-update" Feb 19 22:57:11 crc kubenswrapper[4795]: E0219 22:57:11.710007 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc96fc8-80e4-4dda-af2e-91390b6af829" containerName="mariadb-database-create" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.710085 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc96fc8-80e4-4dda-af2e-91390b6af829" containerName="mariadb-database-create" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.710367 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc96fc8-80e4-4dda-af2e-91390b6af829" 
containerName="mariadb-database-create" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.710440 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda8a248-0107-4d34-a02b-6dbf30972c64" containerName="mariadb-account-create-update" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.711660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.735206 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"] Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.739742 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-srnhx"] Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.742270 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.745955 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.746055 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.746187 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lzv2r" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.772382 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-srnhx"] Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.862745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " 
pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.862790 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.862815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.862989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " 
pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4fc\" (UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863664 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.966092 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " 
pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.966151 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.966570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.967062 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.967709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.967790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.967816 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968201 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968239 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss4fc\" 
(UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.969041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.969241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.973722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.986084 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " 
pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.986853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.987471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss4fc\" (UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.987655 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.046371 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.059110 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.536777 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"] Feb 19 22:57:12 crc kubenswrapper[4795]: W0219 22:57:12.539578 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19cb42f3_600f_4079_9dcd_6ba8697d5778.slice/crio-1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6 WatchSource:0}: Error finding container 1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6: Status 404 returned error can't find the container with id 1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6 Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.620817 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-srnhx"] Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.736317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srnhx" event={"ID":"7286d7ba-7f8c-4f40-a18a-d29af788c344","Type":"ContainerStarted","Data":"39f7733084b133f8aced512072edf7f34651de80ae0434cc8b9bc380ae92dc8a"} Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.738448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerStarted","Data":"1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6"} Feb 19 22:57:13 crc kubenswrapper[4795]: I0219 22:57:13.752713 4795 generic.go:334] "Generic (PLEG): container finished" podID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerID="d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7" exitCode=0 Feb 19 22:57:13 crc kubenswrapper[4795]: I0219 22:57:13.752812 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" 
event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerDied","Data":"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7"} Feb 19 22:57:13 crc kubenswrapper[4795]: I0219 22:57:13.756398 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srnhx" event={"ID":"7286d7ba-7f8c-4f40-a18a-d29af788c344","Type":"ContainerStarted","Data":"2fce75ec54d1aa6599ee05902a3278bc712bcedfa4ab8adb5ae0a48b2003dc71"} Feb 19 22:57:13 crc kubenswrapper[4795]: I0219 22:57:13.824333 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-srnhx" podStartSLOduration=2.8243080689999998 podStartE2EDuration="2.824308069s" podCreationTimestamp="2026-02-19 22:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:57:13.797460696 +0000 UTC m=+5344.989978560" watchObservedRunningTime="2026-02-19 22:57:13.824308069 +0000 UTC m=+5345.016825933" Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.767037 4795 generic.go:334] "Generic (PLEG): container finished" podID="7286d7ba-7f8c-4f40-a18a-d29af788c344" containerID="2fce75ec54d1aa6599ee05902a3278bc712bcedfa4ab8adb5ae0a48b2003dc71" exitCode=0 Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.767135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srnhx" event={"ID":"7286d7ba-7f8c-4f40-a18a-d29af788c344","Type":"ContainerDied","Data":"2fce75ec54d1aa6599ee05902a3278bc712bcedfa4ab8adb5ae0a48b2003dc71"} Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.774057 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerStarted","Data":"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e"} Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.775270 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.829603 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" podStartSLOduration=3.829572654 podStartE2EDuration="3.829572654s" podCreationTimestamp="2026-02-19 22:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:57:14.818348918 +0000 UTC m=+5346.010866792" watchObservedRunningTime="2026-02-19 22:57:14.829572654 +0000 UTC m=+5346.022090558" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.216421 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262354 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss4fc\" (UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262490 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262513 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.263970 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs" (OuterVolumeSpecName: "logs") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.271874 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts" (OuterVolumeSpecName: "scripts") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.273698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc" (OuterVolumeSpecName: "kube-api-access-ss4fc") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "kube-api-access-ss4fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.302076 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.312505 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data" (OuterVolumeSpecName: "config-data") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364503 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364540 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364550 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss4fc\" (UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364562 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 
22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364573 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.792243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srnhx" event={"ID":"7286d7ba-7f8c-4f40-a18a-d29af788c344","Type":"ContainerDied","Data":"39f7733084b133f8aced512072edf7f34651de80ae0434cc8b9bc380ae92dc8a"} Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.792314 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39f7733084b133f8aced512072edf7f34651de80ae0434cc8b9bc380ae92dc8a" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.793122 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.407920 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-764895875b-czlhk"] Feb 19 22:57:17 crc kubenswrapper[4795]: E0219 22:57:17.408680 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7286d7ba-7f8c-4f40-a18a-d29af788c344" containerName="placement-db-sync" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.408700 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7286d7ba-7f8c-4f40-a18a-d29af788c344" containerName="placement-db-sync" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.408882 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7286d7ba-7f8c-4f40-a18a-d29af788c344" containerName="placement-db-sync" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.409712 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.416581 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.427984 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.428149 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lzv2r" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.437732 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-764895875b-czlhk"] Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523007 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-logs\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-config-data\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-scripts\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523337 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-combined-ca-bundle\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523392 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjsf8\" (UniqueName: \"kubernetes.io/projected/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-kube-api-access-kjsf8\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625304 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-config-data\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-scripts\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-combined-ca-bundle\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-kjsf8\" (UniqueName: \"kubernetes.io/projected/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-kube-api-access-kjsf8\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-logs\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.628577 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-logs\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.632801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-scripts\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.633304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-combined-ca-bundle\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.648030 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-config-data\") pod 
\"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.650974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjsf8\" (UniqueName: \"kubernetes.io/projected/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-kube-api-access-kjsf8\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.734860 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.236842 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-764895875b-czlhk"] Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.820002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-764895875b-czlhk" event={"ID":"f4bb335d-ad73-403a-a25f-8e6f33f60ecb","Type":"ContainerStarted","Data":"9476662e8a510c04df37122068019e7f7875343ef01264ebaaa67abfb0d911b4"} Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.820343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-764895875b-czlhk" event={"ID":"f4bb335d-ad73-403a-a25f-8e6f33f60ecb","Type":"ContainerStarted","Data":"9d11f5865edf18cb515d62ead7b9e3a7372f85f4e5b64b2b743f8320cebeb94c"} Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.820481 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.820499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-764895875b-czlhk" event={"ID":"f4bb335d-ad73-403a-a25f-8e6f33f60ecb","Type":"ContainerStarted","Data":"612dc9b36f7f89520aa4f1d4e17d7807b27e3c7ad88948a097356a1f7eb1479f"} Feb 19 22:57:18 crc 
kubenswrapper[4795]: I0219 22:57:18.844613 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-764895875b-czlhk" podStartSLOduration=1.844594591 podStartE2EDuration="1.844594591s" podCreationTimestamp="2026-02-19 22:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:57:18.84141541 +0000 UTC m=+5350.033933294" watchObservedRunningTime="2026-02-19 22:57:18.844594591 +0000 UTC m=+5350.037112465" Feb 19 22:57:19 crc kubenswrapper[4795]: I0219 22:57:19.831593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.048341 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.124798 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.125598 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="dnsmasq-dns" containerID="cri-o://dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" gracePeriod=10 Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.581778 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624683 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624752 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.632300 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q" (OuterVolumeSpecName: "kube-api-access-bmn7q") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "kube-api-access-bmn7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.683855 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config" (OuterVolumeSpecName: "config") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.686693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.689523 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.691516 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727017 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727376 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727388 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727400 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727409 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.856257 4795 generic.go:334] "Generic (PLEG): container finished" podID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerID="dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" exitCode=0 Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.856306 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerDied","Data":"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917"} Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 
22:57:22.856336 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerDied","Data":"7922c7a8658b3cbc38083041c7e26029e3e327c9354e7a22ada01242a334b1d4"} Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.856353 4795 scope.go:117] "RemoveContainer" containerID="dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.856427 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.879484 4795 scope.go:117] "RemoveContainer" containerID="23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.904264 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.910495 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.919668 4795 scope.go:117] "RemoveContainer" containerID="dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" Feb 19 22:57:22 crc kubenswrapper[4795]: E0219 22:57:22.920134 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917\": container with ID starting with dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917 not found: ID does not exist" containerID="dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.920178 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917"} err="failed to get container status \"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917\": rpc error: code = NotFound desc = could not find container \"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917\": container with ID starting with dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917 not found: ID does not exist" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.920200 4795 scope.go:117] "RemoveContainer" containerID="23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261" Feb 19 22:57:22 crc kubenswrapper[4795]: E0219 22:57:22.920791 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261\": container with ID starting with 23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261 not found: ID does not exist" containerID="23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.920817 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261"} err="failed to get container status \"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261\": rpc error: code = NotFound desc = could not find container \"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261\": container with ID starting with 23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261 not found: ID does not exist" Feb 19 22:57:23 crc kubenswrapper[4795]: I0219 22:57:23.524821 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" path="/var/lib/kubelet/pods/1847cbdb-2b75-48d9-ab0c-db5da5a236a4/volumes" Feb 19 22:57:48 crc kubenswrapper[4795]: I0219 
22:57:48.688636 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:49 crc kubenswrapper[4795]: I0219 22:57:49.691734 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-764895875b-czlhk" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.447007 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qp45n"] Feb 19 22:58:13 crc kubenswrapper[4795]: E0219 22:58:13.447875 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="dnsmasq-dns" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.447887 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="dnsmasq-dns" Feb 19 22:58:13 crc kubenswrapper[4795]: E0219 22:58:13.447913 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="init" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.447919 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="init" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.448069 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="dnsmasq-dns" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.448643 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.462720 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qp45n"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.545076 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-x75sd"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.546487 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.562235 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x75sd"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.588859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.589039 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.658573 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wj996"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.659694 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.668752 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.670309 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.672013 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.681239 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wj996"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.690371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.690436 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.690516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc 
kubenswrapper[4795]: I0219 22:58:13.690550 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.693825 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.705783 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.715389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.766363 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") 
pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791888 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.792657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.813525 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.862395 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.895617 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.897819 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.898467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.898613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.898780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.898934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.900709 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.902707 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.918009 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.921837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.926510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.930257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.978353 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.988523 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.002314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.002355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.064566 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"] Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.066940 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.069227 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.078094 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"] Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.104910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.104958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.106041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.140298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") pod 
\"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.206915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.207298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.236203 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qp45n"] Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.279922 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.308601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.308706 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.309712 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.326116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.348395 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wj996"] Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 
22:58:14.351528 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qp45n" event={"ID":"c14f4993-80e4-4fbf-a719-22f17750811b","Type":"ContainerStarted","Data":"8451213e68f88b50acd285c2a04d0acc0ec369b3f0e6b3da41528231547370ea"} Feb 19 22:58:14 crc kubenswrapper[4795]: W0219 22:58:14.356066 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f811fa4_8fb3_4adc_a9a8_6539dc03494c.slice/crio-60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634 WatchSource:0}: Error finding container 60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634: Status 404 returned error can't find the container with id 60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634 Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.370933 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x75sd"] Feb 19 22:58:14 crc kubenswrapper[4795]: W0219 22:58:14.376903 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41276d39_878a_4ed2_879b_2a053340874e.slice/crio-efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de WatchSource:0}: Error finding container efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de: Status 404 returned error can't find the container with id efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.393775 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.492123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"] Feb 19 22:58:14 crc kubenswrapper[4795]: W0219 22:58:14.496977 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4e5010_15f4_499e_8279_9a1b814b5490.slice/crio-74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae WatchSource:0}: Error finding container 74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae: Status 404 returned error can't find the container with id 74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.726622 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"] Feb 19 22:58:14 crc kubenswrapper[4795]: W0219 22:58:14.736690 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32478c4a_a97f_4fd3_84f0_a3c221beefe9.slice/crio-71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c WatchSource:0}: Error finding container 71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c: Status 404 returned error can't find the container with id 71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.886070 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"] Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.359149 4795 generic.go:334] "Generic (PLEG): container finished" podID="41276d39-878a-4ed2-879b-2a053340874e" containerID="aa245a9fc5b7b00ccfbfeb6940d75331800d0a653cedd61f24d6b1162fb6f41d" exitCode=0 Feb 19 22:58:15 crc kubenswrapper[4795]: 
I0219 22:58:15.359224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x75sd" event={"ID":"41276d39-878a-4ed2-879b-2a053340874e","Type":"ContainerDied","Data":"aa245a9fc5b7b00ccfbfeb6940d75331800d0a653cedd61f24d6b1162fb6f41d"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.359293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x75sd" event={"ID":"41276d39-878a-4ed2-879b-2a053340874e","Type":"ContainerStarted","Data":"efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.361337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" event={"ID":"32478c4a-a97f-4fd3-84f0-a3c221beefe9","Type":"ContainerStarted","Data":"886bef23bb00ea44b033b4812e88d91f6efe4ec5933a69343ea4b9bc4fed7503"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.361372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" event={"ID":"32478c4a-a97f-4fd3-84f0-a3c221beefe9","Type":"ContainerStarted","Data":"71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.363020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" event={"ID":"b6f38e11-ea05-447d-8564-117c0f589d88","Type":"ContainerStarted","Data":"0a09a6dbfb16ba30386d5f0a4f52128dc79107a79561ffd233f9f60e4e73100d"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.363065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" event={"ID":"b6f38e11-ea05-447d-8564-117c0f589d88","Type":"ContainerStarted","Data":"b2256e32f6f1b01f32e54b251404e523a23094a46721ff340018bd197207ffa0"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.364593 4795 generic.go:334] "Generic 
(PLEG): container finished" podID="c14f4993-80e4-4fbf-a719-22f17750811b" containerID="00e59b7398cdff96aaf8a625c69d45b06909964fc853d0d0c0e9c6a12321c27b" exitCode=0 Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.364666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qp45n" event={"ID":"c14f4993-80e4-4fbf-a719-22f17750811b","Type":"ContainerDied","Data":"00e59b7398cdff96aaf8a625c69d45b06909964fc853d0d0c0e9c6a12321c27b"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.366200 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd4e5010-15f4-499e-8279-9a1b814b5490" containerID="4ee5bc336806db9668bb12f08ece22f6cb10a6fe341f3f2abd0e9309ef59d764" exitCode=0 Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.366265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c94f-account-create-update-rqkwd" event={"ID":"bd4e5010-15f4-499e-8279-9a1b814b5490","Type":"ContainerDied","Data":"4ee5bc336806db9668bb12f08ece22f6cb10a6fe341f3f2abd0e9309ef59d764"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.366313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c94f-account-create-update-rqkwd" event={"ID":"bd4e5010-15f4-499e-8279-9a1b814b5490","Type":"ContainerStarted","Data":"74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.368252 4795 generic.go:334] "Generic (PLEG): container finished" podID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" containerID="2ef66b4d7c2d7c4390b233ae4586dc431786dee8c34b27c0308b9e8e392e9643" exitCode=0 Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.368297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wj996" event={"ID":"6f811fa4-8fb3-4adc-a9a8-6539dc03494c","Type":"ContainerDied","Data":"2ef66b4d7c2d7c4390b233ae4586dc431786dee8c34b27c0308b9e8e392e9643"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 
22:58:15.368317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wj996" event={"ID":"6f811fa4-8fb3-4adc-a9a8-6539dc03494c","Type":"ContainerStarted","Data":"60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.392681 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" podStartSLOduration=2.392637781 podStartE2EDuration="2.392637781s" podCreationTimestamp="2026-02-19 22:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:15.391597885 +0000 UTC m=+5406.584115749" watchObservedRunningTime="2026-02-19 22:58:15.392637781 +0000 UTC m=+5406.585155665" Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.428285 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" podStartSLOduration=1.428261558 podStartE2EDuration="1.428261558s" podCreationTimestamp="2026-02-19 22:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:15.424465681 +0000 UTC m=+5406.616983555" watchObservedRunningTime="2026-02-19 22:58:15.428261558 +0000 UTC m=+5406.620779432" Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.436537 4795 scope.go:117] "RemoveContainer" containerID="f513386e45a86e8c71f0247f1dc15d3e271452b3797cb4850f3ce460e73e544e" Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.531728 4795 scope.go:117] "RemoveContainer" containerID="ddf33d9929aa340d91607be45827bb09c61b346f1c01e148af46a4a7d07c2f45" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.377775 4795 generic.go:334] "Generic (PLEG): container finished" podID="b6f38e11-ea05-447d-8564-117c0f589d88" 
containerID="0a09a6dbfb16ba30386d5f0a4f52128dc79107a79561ffd233f9f60e4e73100d" exitCode=0 Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.377986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" event={"ID":"b6f38e11-ea05-447d-8564-117c0f589d88","Type":"ContainerDied","Data":"0a09a6dbfb16ba30386d5f0a4f52128dc79107a79561ffd233f9f60e4e73100d"} Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.381711 4795 generic.go:334] "Generic (PLEG): container finished" podID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" containerID="886bef23bb00ea44b033b4812e88d91f6efe4ec5933a69343ea4b9bc4fed7503" exitCode=0 Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.381894 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" event={"ID":"32478c4a-a97f-4fd3-84f0-a3c221beefe9","Type":"ContainerDied","Data":"886bef23bb00ea44b033b4812e88d91f6efe4ec5933a69343ea4b9bc4fed7503"} Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.807183 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.853204 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") pod \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.853375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") pod \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.857899 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f811fa4-8fb3-4adc-a9a8-6539dc03494c" (UID: "6f811fa4-8fb3-4adc-a9a8-6539dc03494c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.879526 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg" (OuterVolumeSpecName: "kube-api-access-8pdvg") pod "6f811fa4-8fb3-4adc-a9a8-6539dc03494c" (UID: "6f811fa4-8fb3-4adc-a9a8-6539dc03494c"). InnerVolumeSpecName "kube-api-access-8pdvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.935045 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.942514 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.949447 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.963521 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.964010 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.064850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") pod \"41276d39-878a-4ed2-879b-2a053340874e\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.064931 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") pod \"c14f4993-80e4-4fbf-a719-22f17750811b\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065131 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") pod \"bd4e5010-15f4-499e-8279-9a1b814b5490\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065231 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") pod \"41276d39-878a-4ed2-879b-2a053340874e\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065249 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") pod \"bd4e5010-15f4-499e-8279-9a1b814b5490\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065291 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") pod \"c14f4993-80e4-4fbf-a719-22f17750811b\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41276d39-878a-4ed2-879b-2a053340874e" (UID: "41276d39-878a-4ed2-879b-2a053340874e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065875 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd4e5010-15f4-499e-8279-9a1b814b5490" (UID: "bd4e5010-15f4-499e-8279-9a1b814b5490"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065975 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c14f4993-80e4-4fbf-a719-22f17750811b" (UID: "c14f4993-80e4-4fbf-a719-22f17750811b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.068510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f" (OuterVolumeSpecName: "kube-api-access-65k7f") pod "c14f4993-80e4-4fbf-a719-22f17750811b" (UID: "c14f4993-80e4-4fbf-a719-22f17750811b"). InnerVolumeSpecName "kube-api-access-65k7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.069287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl" (OuterVolumeSpecName: "kube-api-access-xlgkl") pod "bd4e5010-15f4-499e-8279-9a1b814b5490" (UID: "bd4e5010-15f4-499e-8279-9a1b814b5490"). InnerVolumeSpecName "kube-api-access-xlgkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.070757 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2" (OuterVolumeSpecName: "kube-api-access-mv5w2") pod "41276d39-878a-4ed2-879b-2a053340874e" (UID: "41276d39-878a-4ed2-879b-2a053340874e"). InnerVolumeSpecName "kube-api-access-mv5w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167249 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167286 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167298 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167307 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167316 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167324 4795 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.394891 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.394895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x75sd" event={"ID":"41276d39-878a-4ed2-879b-2a053340874e","Type":"ContainerDied","Data":"efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de"} Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.394948 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.396763 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.396767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qp45n" event={"ID":"c14f4993-80e4-4fbf-a719-22f17750811b","Type":"ContainerDied","Data":"8451213e68f88b50acd285c2a04d0acc0ec369b3f0e6b3da41528231547370ea"} Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.396809 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8451213e68f88b50acd285c2a04d0acc0ec369b3f0e6b3da41528231547370ea" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.399657 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.399650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c94f-account-create-update-rqkwd" event={"ID":"bd4e5010-15f4-499e-8279-9a1b814b5490","Type":"ContainerDied","Data":"74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae"} Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.400049 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.401541 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.403294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wj996" event={"ID":"6f811fa4-8fb3-4adc-a9a8-6539dc03494c","Type":"ContainerDied","Data":"60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634"} Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.403500 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.695502 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.778561 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") pod \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.778737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") pod \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.779107 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32478c4a-a97f-4fd3-84f0-a3c221beefe9" (UID: "32478c4a-a97f-4fd3-84f0-a3c221beefe9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.779237 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.784280 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq" (OuterVolumeSpecName: "kube-api-access-pdgwq") pod "32478c4a-a97f-4fd3-84f0-a3c221beefe9" (UID: "32478c4a-a97f-4fd3-84f0-a3c221beefe9"). InnerVolumeSpecName "kube-api-access-pdgwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.853911 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.881209 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.982761 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") pod \"b6f38e11-ea05-447d-8564-117c0f589d88\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.982891 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") pod \"b6f38e11-ea05-447d-8564-117c0f589d88\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.983536 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6f38e11-ea05-447d-8564-117c0f589d88" (UID: "b6f38e11-ea05-447d-8564-117c0f589d88"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.985994 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl" (OuterVolumeSpecName: "kube-api-access-6sctl") pod "b6f38e11-ea05-447d-8564-117c0f589d88" (UID: "b6f38e11-ea05-447d-8564-117c0f589d88"). InnerVolumeSpecName "kube-api-access-6sctl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.085111 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.085144 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.414015 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" event={"ID":"32478c4a-a97f-4fd3-84f0-a3c221beefe9","Type":"ContainerDied","Data":"71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c"} Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.414072 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.414335 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.417977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" event={"ID":"b6f38e11-ea05-447d-8564-117c0f589d88","Type":"ContainerDied","Data":"b2256e32f6f1b01f32e54b251404e523a23094a46721ff340018bd197207ffa0"} Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.418038 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2256e32f6f1b01f32e54b251404e523a23094a46721ff340018bd197207ffa0" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.418122 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.057310 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"] Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.057981 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14f4993-80e4-4fbf-a719-22f17750811b" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.057998 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14f4993-80e4-4fbf-a719-22f17750811b" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058032 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4e5010-15f4-499e-8279-9a1b814b5490" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058040 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4e5010-15f4-499e-8279-9a1b814b5490" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058061 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058070 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058085 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058092 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058109 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41276d39-878a-4ed2-879b-2a053340874e" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058116 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="41276d39-878a-4ed2-879b-2a053340874e" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058129 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f38e11-ea05-447d-8564-117c0f589d88" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f38e11-ea05-447d-8564-117c0f589d88" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058347 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14f4993-80e4-4fbf-a719-22f17750811b" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058363 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4e5010-15f4-499e-8279-9a1b814b5490" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc 
kubenswrapper[4795]: I0219 22:58:19.058378 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058392 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f38e11-ea05-447d-8564-117c0f589d88" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058410 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058426 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="41276d39-878a-4ed2-879b-2a053340874e" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.059133 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.062119 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ghzps" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.062151 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.062406 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.070049 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"] Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.203866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") pod 
\"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.204136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.204417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.204493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.305507 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.305557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.305648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.305730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.310725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.310918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.311872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.333930 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.385395 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.808998 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"] Feb 19 22:58:19 crc kubenswrapper[4795]: W0219 22:58:19.810667 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303b3e4f_4b2b_4071_b54d_fe4aec3f18f5.slice/crio-c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30 WatchSource:0}: Error finding container c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30: Status 404 returned error can't find the container with id c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30 Feb 19 22:58:20 crc kubenswrapper[4795]: I0219 22:58:20.459225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" event={"ID":"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5","Type":"ContainerStarted","Data":"eb9dfa4e109128ae704882fd6740207250d7781752ee79e05d86190dadc20b22"} Feb 19 22:58:20 crc kubenswrapper[4795]: I0219 22:58:20.459281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-9nmbd" event={"ID":"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5","Type":"ContainerStarted","Data":"c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30"} Feb 19 22:58:20 crc kubenswrapper[4795]: I0219 22:58:20.472755 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" podStartSLOduration=1.472734107 podStartE2EDuration="1.472734107s" podCreationTimestamp="2026-02-19 22:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:20.472402738 +0000 UTC m=+5411.664920632" watchObservedRunningTime="2026-02-19 22:58:20.472734107 +0000 UTC m=+5411.665251971" Feb 19 22:58:25 crc kubenswrapper[4795]: I0219 22:58:25.510355 4795 generic.go:334] "Generic (PLEG): container finished" podID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" containerID="eb9dfa4e109128ae704882fd6740207250d7781752ee79e05d86190dadc20b22" exitCode=0 Feb 19 22:58:25 crc kubenswrapper[4795]: I0219 22:58:25.510463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" event={"ID":"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5","Type":"ContainerDied","Data":"eb9dfa4e109128ae704882fd6740207250d7781752ee79e05d86190dadc20b22"} Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.803699 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.833840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") pod \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.833906 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") pod \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.833955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") pod \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.834040 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") pod \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.839052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk" (OuterVolumeSpecName: "kube-api-access-5lplk") pod "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" (UID: "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5"). InnerVolumeSpecName "kube-api-access-5lplk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.839318 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts" (OuterVolumeSpecName: "scripts") pod "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" (UID: "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.857751 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" (UID: "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.859646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data" (OuterVolumeSpecName: "config-data") pod "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" (UID: "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.936350 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.936391 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.936406 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.936418 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.534751 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.534704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" event={"ID":"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5","Type":"ContainerDied","Data":"c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30"} Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.534877 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.607250 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 22:58:27 crc kubenswrapper[4795]: E0219 22:58:27.607609 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" containerName="nova-cell0-conductor-db-sync" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.607621 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" containerName="nova-cell0-conductor-db-sync" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.607768 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" containerName="nova-cell0-conductor-db-sync" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.608318 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.611540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.613181 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ghzps" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.628342 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.648819 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.649043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.649472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l54h5\" (UniqueName: \"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.751230 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l54h5\" (UniqueName: 
\"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.751314 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.751386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.758115 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.758602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.769152 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l54h5\" (UniqueName: \"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") pod \"nova-cell0-conductor-0\" (UID: 
\"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.927183 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:28 crc kubenswrapper[4795]: W0219 22:58:28.371134 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9839fd0b_0161_4772_bda3_ddc2914d7e83.slice/crio-453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8 WatchSource:0}: Error finding container 453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8: Status 404 returned error can't find the container with id 453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8 Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.372543 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.428236 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.428486 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.554231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"9839fd0b-0161-4772-bda3-ddc2914d7e83","Type":"ContainerStarted","Data":"453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8"} Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.554861 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.579236 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.5792107880000001 podStartE2EDuration="1.579210788s" podCreationTimestamp="2026-02-19 22:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:28.572400455 +0000 UTC m=+5419.764918339" watchObservedRunningTime="2026-02-19 22:58:28.579210788 +0000 UTC m=+5419.771728712" Feb 19 22:58:29 crc kubenswrapper[4795]: I0219 22:58:29.565405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9839fd0b-0161-4772-bda3-ddc2914d7e83","Type":"ContainerStarted","Data":"23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3"} Feb 19 22:58:37 crc kubenswrapper[4795]: I0219 22:58:37.952571 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.507694 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.508990 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.511804 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.512256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.529954 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.610790 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.613276 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.614908 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.620755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.630699 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.630765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " 
pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.630921 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.631105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.702757 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.703906 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.705908 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.711267 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733201 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733227 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" 
Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733377 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.742660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.743324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.750467 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.751784 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.752145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.755900 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.774649 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.793082 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.826741 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.833914 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836393 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836511 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.853938 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.859219 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.873092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.874799 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.880070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.905839 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.935546 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938137 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938197 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") pod \"nova-metadata-0\" (UID: 
\"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938320 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938415 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938435 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.941462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.946760 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.947365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.955060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.958327 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.962997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.963545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.971704 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.982409 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.028919 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040299 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040842 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040880 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " 
pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040989 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.041011 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.042321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.042729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.052686 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.052837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.075235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.143694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144060 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144099 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") 
" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144143 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144186 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.145062 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.145194 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc 
kubenswrapper[4795]: I0219 22:58:39.145286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.155099 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.162857 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.264682 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.279671 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.361547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.475251 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 22:58:39 crc kubenswrapper[4795]: W0219 22:58:39.480669 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53af86cc_b0d0_4ba5_9294_4aaa6cef6c09.slice/crio-3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43 WatchSource:0}: Error finding container 3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43: Status 404 returned error can't find the container with id 3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43 Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.553611 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:39 crc kubenswrapper[4795]: W0219 22:58:39.570456 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda050a33_d860_4577_9ce8_6d85bbdef95f.slice/crio-0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb WatchSource:0}: Error finding container 0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb: Status 404 returned error can't find the container with id 0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.656073 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.657911 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.660478 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.661098 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.666281 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.671439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09","Type":"ContainerStarted","Data":"3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43"} Feb 19 22:58:39 crc kubenswrapper[4795]: W0219 22:58:39.676836 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ce84bcc_7dce_46f4_9ea6_b0f15971eda5.slice/crio-a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b WatchSource:0}: Error finding container a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b: Status 404 returned error can't find the container with id a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.677047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da050a33-d860-4577-9ce8-6d85bbdef95f","Type":"ContainerStarted","Data":"0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb"} Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.681715 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.685070 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cdffm" event={"ID":"7314a002-868e-4028-b341-b719a609e21c","Type":"ContainerStarted","Data":"fd9abe25e7c7b1351c86593c296366b3dd62af78525a82ce72038120bd5feb1c"} Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.685121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cdffm" event={"ID":"7314a002-868e-4028-b341-b719a609e21c","Type":"ContainerStarted","Data":"8e5ecfafc8caabdbe5e796e27409792620ce0b087abf56978f0301d1a3cdf4ce"} Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.704091 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-cdffm" podStartSLOduration=1.7040706810000001 podStartE2EDuration="1.704070681s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:39.699655469 +0000 UTC m=+5430.892173333" watchObservedRunningTime="2026-02-19 22:58:39.704070681 +0000 UTC m=+5430.896588545" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.763331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.763420 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 
22:58:39.763467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.763512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.858266 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.867333 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.867418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.867458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.867496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.868828 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.871906 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.876832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.885178 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.885713 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.010852 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.469032 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"] Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.695829 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerStarted","Data":"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.695878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerStarted","Data":"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.695891 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerStarted","Data":"a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.697426 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerID="b1d9f8ec39116ffdbb4a902a6f66f52cc6253d77fc5e2ba0e18cacfaee7d6753" exitCode=0 Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.697515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerDied","Data":"b1d9f8ec39116ffdbb4a902a6f66f52cc6253d77fc5e2ba0e18cacfaee7d6753"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.697568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerStarted","Data":"ee318881516d5dfefa76dbf8f513dce5d08620986165c943cfdd920651e34509"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.700458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09","Type":"ContainerStarted","Data":"4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.703102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da050a33-d860-4577-9ce8-6d85bbdef95f","Type":"ContainerStarted","Data":"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.705146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" event={"ID":"8065cb60-3c91-4fbc-89f1-7d73d11a85e5","Type":"ContainerStarted","Data":"eae4ff82798e1d79080d2f3fe94f82eb86492da01dc7f84fe2ae4aff95927ca0"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.705202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" event={"ID":"8065cb60-3c91-4fbc-89f1-7d73d11a85e5","Type":"ContainerStarted","Data":"d200ea8cc3bce3281f900f0e016d10c8a3032b94daf0e389837f13d6cbef13db"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.707366 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerStarted","Data":"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.707413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerStarted","Data":"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.707423 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerStarted","Data":"977aa2412acf5cdad9aa8c2df443e4a15718820365efcbbb8fd0c0e8dbc6ee3f"} Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.767411 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.767396383 podStartE2EDuration="2.767396383s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.767200748 +0000 UTC m=+5431.959718612" watchObservedRunningTime="2026-02-19 22:58:40.767396383 +0000 UTC m=+5431.959914247" Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.863461 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.863444768 podStartE2EDuration="2.863444768s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.861640342 +0000 UTC m=+5432.054158206" watchObservedRunningTime="2026-02-19 22:58:40.863444768 +0000 UTC m=+5432.055962632" Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.868788 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-db-sync-7w2k2" podStartSLOduration=1.868773424 podStartE2EDuration="1.868773424s" podCreationTimestamp="2026-02-19 22:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.810371877 +0000 UTC m=+5432.002889741" watchObservedRunningTime="2026-02-19 22:58:40.868773424 +0000 UTC m=+5432.061291278" Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.925068 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.925049606 podStartE2EDuration="2.925049606s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.899795153 +0000 UTC m=+5432.092313027" watchObservedRunningTime="2026-02-19 22:58:40.925049606 +0000 UTC m=+5432.117567470" Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.928580 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9285747559999997 podStartE2EDuration="2.928574756s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.924456911 +0000 UTC m=+5432.116974775" watchObservedRunningTime="2026-02-19 22:58:40.928574756 +0000 UTC m=+5432.121092620" Feb 19 22:58:41 crc kubenswrapper[4795]: I0219 22:58:41.718433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerStarted","Data":"b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f"} Feb 19 22:58:41 crc kubenswrapper[4795]: I0219 22:58:41.720934 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:41 crc kubenswrapper[4795]: I0219 22:58:41.743266 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" podStartSLOduration=3.74324511 podStartE2EDuration="3.74324511s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:41.733530323 +0000 UTC m=+5432.926048187" watchObservedRunningTime="2026-02-19 22:58:41.74324511 +0000 UTC m=+5432.935762974" Feb 19 22:58:43 crc kubenswrapper[4795]: I0219 22:58:43.735182 4795 generic.go:334] "Generic (PLEG): container finished" podID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" containerID="eae4ff82798e1d79080d2f3fe94f82eb86492da01dc7f84fe2ae4aff95927ca0" exitCode=0 Feb 19 22:58:43 crc kubenswrapper[4795]: I0219 22:58:43.735253 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" event={"ID":"8065cb60-3c91-4fbc-89f1-7d73d11a85e5","Type":"ContainerDied","Data":"eae4ff82798e1d79080d2f3fe94f82eb86492da01dc7f84fe2ae4aff95927ca0"} Feb 19 22:58:43 crc kubenswrapper[4795]: I0219 22:58:43.936362 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.029194 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.155957 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.156826 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.747227 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="7314a002-868e-4028-b341-b719a609e21c" containerID="fd9abe25e7c7b1351c86593c296366b3dd62af78525a82ce72038120bd5feb1c" exitCode=0 Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.747549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cdffm" event={"ID":"7314a002-868e-4028-b341-b719a609e21c","Type":"ContainerDied","Data":"fd9abe25e7c7b1351c86593c296366b3dd62af78525a82ce72038120bd5feb1c"} Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.125355 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.174730 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") pod \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.174860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") pod \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.174890 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") pod \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.174987 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") pod 
\"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.180181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts" (OuterVolumeSpecName: "scripts") pod "8065cb60-3c91-4fbc-89f1-7d73d11a85e5" (UID: "8065cb60-3c91-4fbc-89f1-7d73d11a85e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.180363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st" (OuterVolumeSpecName: "kube-api-access-gh5st") pod "8065cb60-3c91-4fbc-89f1-7d73d11a85e5" (UID: "8065cb60-3c91-4fbc-89f1-7d73d11a85e5"). InnerVolumeSpecName "kube-api-access-gh5st". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.204484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8065cb60-3c91-4fbc-89f1-7d73d11a85e5" (UID: "8065cb60-3c91-4fbc-89f1-7d73d11a85e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.205740 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data" (OuterVolumeSpecName: "config-data") pod "8065cb60-3c91-4fbc-89f1-7d73d11a85e5" (UID: "8065cb60-3c91-4fbc-89f1-7d73d11a85e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.276902 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.276929 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.276938 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.276946 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.757852 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.757863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" event={"ID":"8065cb60-3c91-4fbc-89f1-7d73d11a85e5","Type":"ContainerDied","Data":"d200ea8cc3bce3281f900f0e016d10c8a3032b94daf0e389837f13d6cbef13db"} Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.758296 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d200ea8cc3bce3281f900f0e016d10c8a3032b94daf0e389837f13d6cbef13db" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.840768 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 22:58:45 crc kubenswrapper[4795]: E0219 22:58:45.841266 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" containerName="nova-cell1-conductor-db-sync" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.841287 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" containerName="nova-cell1-conductor-db-sync" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.841560 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" containerName="nova-cell1-conductor-db-sync" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.842342 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.844685 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.890250 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.890809 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.890894 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.891089 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.996932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 
22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.997058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.997144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.008933 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.017374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.017568 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.162358 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.269372 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.401899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") pod \"7314a002-868e-4028-b341-b719a609e21c\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.402013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") pod \"7314a002-868e-4028-b341-b719a609e21c\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.402297 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") pod \"7314a002-868e-4028-b341-b719a609e21c\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.402350 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") pod \"7314a002-868e-4028-b341-b719a609e21c\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.406619 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm" (OuterVolumeSpecName: "kube-api-access-kgqqm") pod "7314a002-868e-4028-b341-b719a609e21c" (UID: 
"7314a002-868e-4028-b341-b719a609e21c"). InnerVolumeSpecName "kube-api-access-kgqqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.406682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts" (OuterVolumeSpecName: "scripts") pod "7314a002-868e-4028-b341-b719a609e21c" (UID: "7314a002-868e-4028-b341-b719a609e21c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.428278 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data" (OuterVolumeSpecName: "config-data") pod "7314a002-868e-4028-b341-b719a609e21c" (UID: "7314a002-868e-4028-b341-b719a609e21c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.430225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7314a002-868e-4028-b341-b719a609e21c" (UID: "7314a002-868e-4028-b341-b719a609e21c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.504991 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.505307 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.505320 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.505333 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.576544 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.767489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cb0fc8ea-d689-468f-a612-b87c3e63077d","Type":"ContainerStarted","Data":"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314"} Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.767537 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cb0fc8ea-d689-468f-a612-b87c3e63077d","Type":"ContainerStarted","Data":"0e3ce7752bf63997ed6557e690f1dee58f948c11df612377dc82c2d8fde569b2"} Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.767654 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.769040 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cdffm" event={"ID":"7314a002-868e-4028-b341-b719a609e21c","Type":"ContainerDied","Data":"8e5ecfafc8caabdbe5e796e27409792620ce0b087abf56978f0301d1a3cdf4ce"} Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.769086 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e5ecfafc8caabdbe5e796e27409792620ce0b087abf56978f0301d1a3cdf4ce" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.769160 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.797453 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.7974353459999999 podStartE2EDuration="1.797435346s" podCreationTimestamp="2026-02-19 22:58:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:46.785482392 +0000 UTC m=+5437.978000296" watchObservedRunningTime="2026-02-19 22:58:46.797435346 +0000 UTC m=+5437.989953210" Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.951502 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.951753 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-log" containerID="cri-o://7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662" gracePeriod=30 Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.951912 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-api" containerID="cri-o://88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413" gracePeriod=30
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.972180 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.972447 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerName="nova-scheduler-scheduler" containerID="cri-o://3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" gracePeriod=30
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.995073 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.995317 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-log" containerID="cri-o://1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f" gracePeriod=30
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.995409 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-metadata" containerID="cri-o://cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8" gracePeriod=30
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.505040 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.625007 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") pod \"1242edbc-6450-4d81-8c77-a15fe928d782\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.625114 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") pod \"1242edbc-6450-4d81-8c77-a15fe928d782\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.625182 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") pod \"1242edbc-6450-4d81-8c77-a15fe928d782\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.625264 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") pod \"1242edbc-6450-4d81-8c77-a15fe928d782\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.626508 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs" (OuterVolumeSpecName: "logs") pod "1242edbc-6450-4d81-8c77-a15fe928d782" (UID: "1242edbc-6450-4d81-8c77-a15fe928d782"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.631779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz" (OuterVolumeSpecName: "kube-api-access-nrbdz") pod "1242edbc-6450-4d81-8c77-a15fe928d782" (UID: "1242edbc-6450-4d81-8c77-a15fe928d782"). InnerVolumeSpecName "kube-api-access-nrbdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.654145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1242edbc-6450-4d81-8c77-a15fe928d782" (UID: "1242edbc-6450-4d81-8c77-a15fe928d782"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.657033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data" (OuterVolumeSpecName: "config-data") pod "1242edbc-6450-4d81-8c77-a15fe928d782" (UID: "1242edbc-6450-4d81-8c77-a15fe928d782"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.694355 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.726594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") pod \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.726708 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") pod \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.726781 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") pod \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.726855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") pod \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727310 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727330 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727342 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727353 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727312 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs" (OuterVolumeSpecName: "logs") pod "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" (UID: "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.732487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk" (OuterVolumeSpecName: "kube-api-access-7knpk") pod "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" (UID: "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5"). InnerVolumeSpecName "kube-api-access-7knpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.747525 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data" (OuterVolumeSpecName: "config-data") pod "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" (UID: "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.752133 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" (UID: "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779135 4795 generic.go:334] "Generic (PLEG): container finished" podID="1242edbc-6450-4d81-8c77-a15fe928d782" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413" exitCode=0
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779206 4795 generic.go:334] "Generic (PLEG): container finished" podID="1242edbc-6450-4d81-8c77-a15fe928d782" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662" exitCode=143
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerDied","Data":"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerDied","Data":"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerDied","Data":"977aa2412acf5cdad9aa8c2df443e4a15718820365efcbbb8fd0c0e8dbc6ee3f"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779286 4795 scope.go:117] "RemoveContainer" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779223 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781778 4795 generic.go:334] "Generic (PLEG): container finished" podID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8" exitCode=0
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781823 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerDied","Data":"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerDied","Data":"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781819 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781834 4795 generic.go:334] "Generic (PLEG): container finished" podID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f" exitCode=143
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerDied","Data":"a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.804100 4795 scope.go:117] "RemoveContainer" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.837689 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.837717 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.837727 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.837735 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.839119 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.842980 4795 scope.go:117] "RemoveContainer" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"
Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.844608 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": container with ID starting with 88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413 not found: ID does not exist" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.844640 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"} err="failed to get container status \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": rpc error: code = NotFound desc = could not find container \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": container with ID starting with 88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413 not found: ID does not exist"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.844661 4795 scope.go:117] "RemoveContainer" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"
Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.848652 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": container with ID starting with 7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662 not found: ID does not exist" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.848694 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"} err="failed to get container status \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": rpc error: code = NotFound desc = could not find container \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": container with ID starting with 7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662 not found: ID does not exist"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.848720 4795 scope.go:117] "RemoveContainer" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.850300 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"} err="failed to get container status \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": rpc error: code = NotFound desc = could not find container \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": container with ID starting with 88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413 not found: ID does not exist"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.850333 4795 scope.go:117] "RemoveContainer" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.851781 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"} err="failed to get container status \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": rpc error: code = NotFound desc = could not find container \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": container with ID starting with 7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662 not found: ID does not exist"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.851810 4795 scope.go:117] "RemoveContainer" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.872432 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.887261 4795 scope.go:117] "RemoveContainer" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.887382 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.908827 4795 scope.go:117] "RemoveContainer" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"
Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.909330 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": container with ID starting with cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8 not found: ID does not exist" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909363 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"} err="failed to get container status \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": rpc error: code = NotFound desc = could not find container \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": container with ID starting with cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8 not found: ID does not exist"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909382 4795 scope.go:117] "RemoveContainer" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"
Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.909697 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": container with ID starting with 1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f not found: ID does not exist" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909727 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"} err="failed to get container status \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": rpc error: code = NotFound desc = could not find container \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": container with ID starting with 1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f not found: ID does not exist"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909742 4795 scope.go:117] "RemoveContainer" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909961 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"} err="failed to get container status \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": rpc error: code = NotFound desc = could not find container \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": container with ID starting with cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8 not found: ID does not exist"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909974 4795 scope.go:117] "RemoveContainer" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.910215 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"} err="failed to get container status \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": rpc error: code = NotFound desc = could not find container \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": container with ID starting with 1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f not found: ID does not exist"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.914980 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927115 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927563 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-log"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927586 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-log"
Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927602 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-metadata"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927609 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-metadata"
Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927624 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7314a002-868e-4028-b341-b719a609e21c" containerName="nova-manage"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927631 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7314a002-868e-4028-b341-b719a609e21c" containerName="nova-manage"
Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927640 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-log"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-log"
Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927667 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-api"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927672 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-api"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927834 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-log"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927847 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-metadata"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927863 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-log"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927873 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-api"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927886 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7314a002-868e-4028-b341-b719a609e21c" containerName="nova-manage"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.928849 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.930578 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.943880 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.955070 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.956502 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.960366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.964376 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.041454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.041502 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.041693 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbg69\" (UniqueName: \"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.041772 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.042034 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.042155 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.042250 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.042360 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.143860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.143946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.143969 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.143993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbg69\" (UniqueName: \"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.144010 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.144056 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.144641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.144961 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.145070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.145393 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.148312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.148688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.149195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.149815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.160345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.165974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbg69\" (UniqueName: \"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.246309 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.273229 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.695604 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.755552 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.810076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerStarted","Data":"c46bf84b81c77e4ca078f2984ed1e17e83d62628cf87f4fed723d84f60bf74aa"}
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.811789 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerStarted","Data":"a3e377664d37d778a2b6faa79d4b02ff6f9bf8a82cfec3bee0e75d89770ccd50"}
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.936565 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.947972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.281245 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg"
Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.352096 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"]
Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.352364 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="dnsmasq-dns" containerID="cri-o://8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" gracePeriod=10
Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.523147 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" path="/var/lib/kubelet/pods/1242edbc-6450-4d81-8c77-a15fe928d782/volumes"
Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.523977 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" path="/var/lib/kubelet/pods/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5/volumes"
Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.820911 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k"
Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826731 4795 generic.go:334] "Generic (PLEG): container finished" podID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerID="8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" exitCode=0
Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826795 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerDied","Data":"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e"}
Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826812 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerDied","Data":"1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826863 4795 scope.go:117] "RemoveContainer" containerID="8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.835209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerStarted","Data":"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.835269 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerStarted","Data":"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.842627 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerStarted","Data":"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.842684 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerStarted","Data":"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.858394 4795 scope.go:117] "RemoveContainer" containerID="d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 
22:58:49.858544 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.863457 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.86343945 podStartE2EDuration="2.86343945s" podCreationTimestamp="2026-02-19 22:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:49.861925881 +0000 UTC m=+5441.054443765" watchObservedRunningTime="2026-02-19 22:58:49.86343945 +0000 UTC m=+5441.055957314" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") pod \"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") pod \"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877391 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") pod \"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") pod 
\"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877515 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") pod \"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.894075 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5" (OuterVolumeSpecName: "kube-api-access-nz6z5") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "kube-api-access-nz6z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.921433 4795 scope.go:117] "RemoveContainer" containerID="8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" Feb 19 22:58:49 crc kubenswrapper[4795]: E0219 22:58:49.930738 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e\": container with ID starting with 8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e not found: ID does not exist" containerID="8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.930797 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e"} err="failed to get container status \"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e\": rpc error: code = NotFound desc = could not find container 
\"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e\": container with ID starting with 8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e not found: ID does not exist" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.930826 4795 scope.go:117] "RemoveContainer" containerID="d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7" Feb 19 22:58:49 crc kubenswrapper[4795]: E0219 22:58:49.934203 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7\": container with ID starting with d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7 not found: ID does not exist" containerID="d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.934242 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7"} err="failed to get container status \"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7\": rpc error: code = NotFound desc = could not find container \"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7\": container with ID starting with d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7 not found: ID does not exist" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.953611 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.958361 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.958346085 podStartE2EDuration="2.958346085s" podCreationTimestamp="2026-02-19 22:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:49.929545072 +0000 UTC m=+5441.122062936" watchObservedRunningTime="2026-02-19 22:58:49.958346085 +0000 UTC m=+5441.150863939" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.961722 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config" (OuterVolumeSpecName: "config") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.981871 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.981904 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.981902 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.981916 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.013804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.083907 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.083937 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.166368 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"] Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.174462 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.193678 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.316012 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.404735 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") pod \"da050a33-d860-4577-9ce8-6d85bbdef95f\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.404876 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") pod \"da050a33-d860-4577-9ce8-6d85bbdef95f\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.404952 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") pod \"da050a33-d860-4577-9ce8-6d85bbdef95f\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.408145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5" (OuterVolumeSpecName: "kube-api-access-pmxc5") pod "da050a33-d860-4577-9ce8-6d85bbdef95f" (UID: "da050a33-d860-4577-9ce8-6d85bbdef95f"). InnerVolumeSpecName "kube-api-access-pmxc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.437606 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da050a33-d860-4577-9ce8-6d85bbdef95f" (UID: "da050a33-d860-4577-9ce8-6d85bbdef95f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.437621 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data" (OuterVolumeSpecName: "config-data") pod "da050a33-d860-4577-9ce8-6d85bbdef95f" (UID: "da050a33-d860-4577-9ce8-6d85bbdef95f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.506745 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.506776 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.506786 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.525043 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" path="/var/lib/kubelet/pods/19cb42f3-600f-4079-9dcd-6ba8697d5778/volumes" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605278 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"] Feb 19 22:58:51 crc kubenswrapper[4795]: E0219 22:58:51.605664 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerName="nova-scheduler-scheduler" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605681 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerName="nova-scheduler-scheduler" Feb 19 22:58:51 crc kubenswrapper[4795]: E0219 22:58:51.605694 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="dnsmasq-dns" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605699 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="dnsmasq-dns" Feb 19 22:58:51 crc kubenswrapper[4795]: E0219 22:58:51.605708 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="init" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605716 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="init" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605884 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerName="nova-scheduler-scheduler" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605896 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="dnsmasq-dns" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.606475 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.608943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.609339 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.627560 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.709660 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btq7g\" (UniqueName: \"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.709745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.709801 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.710002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.812336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.812447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btq7g\" (UniqueName: \"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.812475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.812526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.817786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.825119 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.825533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.834016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btq7g\" (UniqueName: \"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875103 4795 generic.go:334] "Generic (PLEG): container finished" podID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerID="3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" exitCode=0 Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da050a33-d860-4577-9ce8-6d85bbdef95f","Type":"ContainerDied","Data":"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd"} Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875203 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da050a33-d860-4577-9ce8-6d85bbdef95f","Type":"ContainerDied","Data":"0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb"} Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875223 4795 scope.go:117] "RemoveContainer" containerID="3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875328 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.917435 4795 scope.go:117] "RemoveContainer" containerID="3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.917432 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:51 crc kubenswrapper[4795]: E0219 22:58:51.918793 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd\": container with ID starting with 3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd not found: ID does not exist" containerID="3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.918944 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd"} err="failed to get container status \"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd\": rpc error: code = NotFound desc = could not find container \"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd\": container with ID starting with 3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd not found: ID does not exist" Feb 19 22:58:51 crc 
kubenswrapper[4795]: I0219 22:58:51.930034 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.940807 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.942230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.944671 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.951607 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.955184 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.019499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.019559 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.019593 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.120851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.121477 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.121540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.131306 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.131482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " 
pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.143353 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.261466 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.409309 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"] Feb 19 22:58:52 crc kubenswrapper[4795]: W0219 22:58:52.414665 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0532ff51_023e_4663_9c95_6545236a8fb3.slice/crio-f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7 WatchSource:0}: Error finding container f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7: Status 404 returned error can't find the container with id f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7 Feb 19 22:58:52 crc kubenswrapper[4795]: W0219 22:58:52.668798 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eee4148_ad3f_42ae_954b_20103c8869e0.slice/crio-6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad WatchSource:0}: Error finding container 6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad: Status 404 returned error can't find the container with id 6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.677726 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:52 crc 
kubenswrapper[4795]: I0219 22:58:52.884486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ltwc9" event={"ID":"0532ff51-023e-4663-9c95-6545236a8fb3","Type":"ContainerStarted","Data":"3e6c87188fd15e9809b882f54d39f3a4062c4abbd3e889c36c86f66a2c644f00"} Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.884530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ltwc9" event={"ID":"0532ff51-023e-4663-9c95-6545236a8fb3","Type":"ContainerStarted","Data":"f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7"} Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.886966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee4148-ad3f-42ae-954b-20103c8869e0","Type":"ContainerStarted","Data":"280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef"} Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.887002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee4148-ad3f-42ae-954b-20103c8869e0","Type":"ContainerStarted","Data":"6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad"} Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.909155 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ltwc9" podStartSLOduration=1.909136586 podStartE2EDuration="1.909136586s" podCreationTimestamp="2026-02-19 22:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:52.897208133 +0000 UTC m=+5444.089725997" watchObservedRunningTime="2026-02-19 22:58:52.909136586 +0000 UTC m=+5444.101654440" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.922868 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.922843815 
podStartE2EDuration="1.922843815s" podCreationTimestamp="2026-02-19 22:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:52.913187789 +0000 UTC m=+5444.105705653" watchObservedRunningTime="2026-02-19 22:58:52.922843815 +0000 UTC m=+5444.115361679" Feb 19 22:58:53 crc kubenswrapper[4795]: I0219 22:58:53.247518 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 22:58:53 crc kubenswrapper[4795]: I0219 22:58:53.247891 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 22:58:53 crc kubenswrapper[4795]: I0219 22:58:53.520702 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" path="/var/lib/kubelet/pods/da050a33-d860-4577-9ce8-6d85bbdef95f/volumes" Feb 19 22:58:57 crc kubenswrapper[4795]: I0219 22:58:57.262510 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 22:58:57 crc kubenswrapper[4795]: I0219 22:58:57.946412 4795 generic.go:334] "Generic (PLEG): container finished" podID="0532ff51-023e-4663-9c95-6545236a8fb3" containerID="3e6c87188fd15e9809b882f54d39f3a4062c4abbd3e889c36c86f66a2c644f00" exitCode=0 Feb 19 22:58:57 crc kubenswrapper[4795]: I0219 22:58:57.946513 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ltwc9" event={"ID":"0532ff51-023e-4663-9c95-6545236a8fb3","Type":"ContainerDied","Data":"3e6c87188fd15e9809b882f54d39f3a4062c4abbd3e889c36c86f66a2c644f00"} Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.247234 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.247281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.274399 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.274460 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.427043 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.427111 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.294405 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.374967 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") pod \"0532ff51-023e-4663-9c95-6545236a8fb3\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.375041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btq7g\" (UniqueName: \"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") pod \"0532ff51-023e-4663-9c95-6545236a8fb3\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.375075 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") pod \"0532ff51-023e-4663-9c95-6545236a8fb3\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.375101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") pod \"0532ff51-023e-4663-9c95-6545236a8fb3\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.396697 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g" (OuterVolumeSpecName: "kube-api-access-btq7g") pod "0532ff51-023e-4663-9c95-6545236a8fb3" (UID: "0532ff51-023e-4663-9c95-6545236a8fb3"). InnerVolumeSpecName "kube-api-access-btq7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.402430 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts" (OuterVolumeSpecName: "scripts") pod "0532ff51-023e-4663-9c95-6545236a8fb3" (UID: "0532ff51-023e-4663-9c95-6545236a8fb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.409698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0532ff51-023e-4663-9c95-6545236a8fb3" (UID: "0532ff51-023e-4663-9c95-6545236a8fb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.414363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data" (OuterVolumeSpecName: "config-data") pod "0532ff51-023e-4663-9c95-6545236a8fb3" (UID: "0532ff51-023e-4663-9c95-6545236a8fb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.416315 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.65:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.416367 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.416412 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.65:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.416676 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.476958 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.477299 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btq7g\" (UniqueName: 
\"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.477395 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.477470 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.968272 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ltwc9" event={"ID":"0532ff51-023e-4663-9c95-6545236a8fb3","Type":"ContainerDied","Data":"f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7"} Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.968722 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.968415 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.162735 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.163019 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log" containerID="cri-o://385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.163120 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api" containerID="cri-o://31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.173953 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.174453 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerName="nova-scheduler-scheduler" containerID="cri-o://280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.230576 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.230920 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" containerID="cri-o://8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.231727 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" containerID="cri-o://f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.979474 4795 generic.go:334] "Generic (PLEG): container finished" podID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerID="385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c" exitCode=143 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.979570 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerDied","Data":"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c"} Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.981914 4795 generic.go:334] "Generic (PLEG): container finished" podID="2afe060c-5092-4927-9f32-02115c78441b" containerID="8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" exitCode=143 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.981960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerDied","Data":"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01"} Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.794063 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.872768 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") pod \"2afe060c-5092-4927-9f32-02115c78441b\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.872821 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") pod \"2afe060c-5092-4927-9f32-02115c78441b\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.873800 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") pod \"2afe060c-5092-4927-9f32-02115c78441b\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.873849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") pod \"2afe060c-5092-4927-9f32-02115c78441b\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.874402 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs" (OuterVolumeSpecName: "logs") pod "2afe060c-5092-4927-9f32-02115c78441b" (UID: "2afe060c-5092-4927-9f32-02115c78441b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.879641 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v" (OuterVolumeSpecName: "kube-api-access-gvb9v") pod "2afe060c-5092-4927-9f32-02115c78441b" (UID: "2afe060c-5092-4927-9f32-02115c78441b"). InnerVolumeSpecName "kube-api-access-gvb9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.914395 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2afe060c-5092-4927-9f32-02115c78441b" (UID: "2afe060c-5092-4927-9f32-02115c78441b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.918803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data" (OuterVolumeSpecName: "config-data") pod "2afe060c-5092-4927-9f32-02115c78441b" (UID: "2afe060c-5092-4927-9f32-02115c78441b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.975922 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.975956 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.975965 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.975973 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017080 4795 generic.go:334] "Generic (PLEG): container finished" podID="2afe060c-5092-4927-9f32-02115c78441b" containerID="f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" exitCode=0 Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017148 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerDied","Data":"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f"} Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerDied","Data":"a3e377664d37d778a2b6faa79d4b02ff6f9bf8a82cfec3bee0e75d89770ccd50"} Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017248 4795 scope.go:117] "RemoveContainer" containerID="f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.018981 4795 generic.go:334] "Generic (PLEG): container finished" podID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerID="280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef" exitCode=0 Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.019011 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee4148-ad3f-42ae-954b-20103c8869e0","Type":"ContainerDied","Data":"280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef"} Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.052442 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.057012 4795 scope.go:117] "RemoveContainer" containerID="8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.061541 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.116009 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.118125 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0532ff51-023e-4663-9c95-6545236a8fb3" containerName="nova-manage" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118153 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0532ff51-023e-4663-9c95-6545236a8fb3" containerName="nova-manage" Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.118196 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118206 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.118230 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118241 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118500 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0532ff51-023e-4663-9c95-6545236a8fb3" containerName="nova-manage" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118528 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118544 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.121016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.124586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.124950 4795 scope.go:117] "RemoveContainer" containerID="f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.126013 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f\": container with ID starting with f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f not found: ID does not exist" containerID="f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.126067 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f"} err="failed to get container status \"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f\": rpc error: code = NotFound desc = could not find container \"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f\": container with ID starting with f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f not found: ID does not exist" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.126111 4795 scope.go:117] "RemoveContainer" containerID="8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.128181 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01\": container with ID starting with 8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01 
not found: ID does not exist" containerID="8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.128215 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01"} err="failed to get container status \"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01\": rpc error: code = NotFound desc = could not find container \"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01\": container with ID starting with 8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01 not found: ID does not exist" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.149975 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.180065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.180146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.180225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " 
pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.180421 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.281633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.281733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.281793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.281859 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.282364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.287895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.288876 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.301270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.319921 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.383418 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") pod \"5eee4148-ad3f-42ae-954b-20103c8869e0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") "
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.383575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") pod \"5eee4148-ad3f-42ae-954b-20103c8869e0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") "
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.383602 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") pod \"5eee4148-ad3f-42ae-954b-20103c8869e0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") "
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.386455 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9" (OuterVolumeSpecName: "kube-api-access-556k9") pod "5eee4148-ad3f-42ae-954b-20103c8869e0" (UID: "5eee4148-ad3f-42ae-954b-20103c8869e0"). InnerVolumeSpecName "kube-api-access-556k9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.404803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data" (OuterVolumeSpecName: "config-data") pod "5eee4148-ad3f-42ae-954b-20103c8869e0" (UID: "5eee4148-ad3f-42ae-954b-20103c8869e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.405893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5eee4148-ad3f-42ae-954b-20103c8869e0" (UID: "5eee4148-ad3f-42ae-954b-20103c8869e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.449951 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.486660 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.486727 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") on node \"crc\" DevicePath \"\""
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.486743 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.883447 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.994684 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.037543 4795 generic.go:334] "Generic (PLEG): container finished" podID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerID="31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024" exitCode=0
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.037671 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.038385 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerDied","Data":"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024"}
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.038424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerDied","Data":"c46bf84b81c77e4ca078f2984ed1e17e83d62628cf87f4fed723d84f60bf74aa"}
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.038446 4795 scope.go:117] "RemoveContainer" containerID="31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.048782 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee4148-ad3f-42ae-954b-20103c8869e0","Type":"ContainerDied","Data":"6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad"}
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.048830 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.050710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerStarted","Data":"6c69b56affcd6e3319b891be2d4392924607b5fdbd45a93af3cf16129ab8f1e4"}
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.076145 4795 scope.go:117] "RemoveContainer" containerID="385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.089290 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.100706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") pod \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") "
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.100773 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") pod \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") "
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.100812 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") pod \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") "
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.100962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbg69\" (UniqueName: \"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") pod \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") "
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.101399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs" (OuterVolumeSpecName: "logs") pod "01aec0cd-4f6c-4299-a07d-48bc5b04206c" (UID: "01aec0cd-4f6c-4299-a07d-48bc5b04206c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.101816 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") on node \"crc\" DevicePath \"\""
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.103885 4795 scope.go:117] "RemoveContainer" containerID="31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024"
Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.104283 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024\": container with ID starting with 31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024 not found: ID does not exist" containerID="31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104324 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024"} err="failed to get container status \"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024\": rpc error: code = NotFound desc = could not find container \"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024\": container with ID starting with 31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024 not found: ID does not exist"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104346 4795 scope.go:117] "RemoveContainer" containerID="385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69" (OuterVolumeSpecName: "kube-api-access-cbg69") pod "01aec0cd-4f6c-4299-a07d-48bc5b04206c" (UID: "01aec0cd-4f6c-4299-a07d-48bc5b04206c"). InnerVolumeSpecName "kube-api-access-cbg69". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.104669 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c\": container with ID starting with 385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c not found: ID does not exist" containerID="385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104701 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c"} err="failed to get container status \"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c\": rpc error: code = NotFound desc = could not find container \"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c\": container with ID starting with 385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c not found: ID does not exist"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104719 4795 scope.go:117] "RemoveContainer" containerID="280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.109098 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117297 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.117654 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117670 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api"
Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.117682 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerName="nova-scheduler-scheduler"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117688 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerName="nova-scheduler-scheduler"
Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.117699 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117705 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117873 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117890 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerName="nova-scheduler-scheduler"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117902 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.118512 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.120604 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.124496 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.125237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01aec0cd-4f6c-4299-a07d-48bc5b04206c" (UID: "01aec0cd-4f6c-4299-a07d-48bc5b04206c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.143041 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data" (OuterVolumeSpecName: "config-data") pod "01aec0cd-4f6c-4299-a07d-48bc5b04206c" (UID: "01aec0cd-4f6c-4299-a07d-48bc5b04206c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.202945 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203078 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203308 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbg69\" (UniqueName: \"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") on node \"crc\" DevicePath \"\""
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203333 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203344 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.304725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.304801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.304830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.308441 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.308466 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.320452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.369631 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.378287 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.389617 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.391108 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.397353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.406856 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.445532 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.507605 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.507847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.507957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.508044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.523621 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" path="/var/lib/kubelet/pods/01aec0cd-4f6c-4299-a07d-48bc5b04206c/volumes"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.524252 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2afe060c-5092-4927-9f32-02115c78441b" path="/var/lib/kubelet/pods/2afe060c-5092-4927-9f32-02115c78441b/volumes"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.524836 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" path="/var/lib/kubelet/pods/5eee4148-ad3f-42ae-954b-20103c8869e0/volumes"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.609854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.609936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.609978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.610015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.610760 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.616110 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.616759 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.624353 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.713695 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.877095 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 22:59:05 crc kubenswrapper[4795]: W0219 22:59:05.880634 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7e72169_9bdf_4b8f_9f8d_ca2a994287da.slice/crio-914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747 WatchSource:0}: Error finding container 914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747: Status 404 returned error can't find the container with id 914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747
Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.060414 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7e72169-9bdf-4b8f-9f8d-ca2a994287da","Type":"ContainerStarted","Data":"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5"}
Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.060453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7e72169-9bdf-4b8f-9f8d-ca2a994287da","Type":"ContainerStarted","Data":"914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747"}
Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.065310 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerStarted","Data":"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6"}
Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.065356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerStarted","Data":"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e"}
Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.085321 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.085305366 podStartE2EDuration="1.085305366s" podCreationTimestamp="2026-02-19 22:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:06.075141687 +0000 UTC m=+5457.267659561" watchObservedRunningTime="2026-02-19 22:59:06.085305366 +0000 UTC m=+5457.277823230"
Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.143741 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.143717403 podStartE2EDuration="2.143717403s" podCreationTimestamp="2026-02-19 22:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:06.095340781 +0000 UTC m=+5457.287858665" watchObservedRunningTime="2026-02-19 22:59:06.143717403 +0000 UTC m=+5457.336235277"
Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.149536 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:59:06 crc kubenswrapper[4795]: W0219 22:59:06.152729 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30b4399f_d971_4131_8243_3c45e8353cdd.slice/crio-d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf WatchSource:0}: Error finding container d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf: Status 404 returned error can't find the container with id d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.080221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerStarted","Data":"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f"}
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.080601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerStarted","Data":"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff"}
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.080619 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerStarted","Data":"d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf"}
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.132485 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.132458428 podStartE2EDuration="2.132458428s" podCreationTimestamp="2026-02-19 22:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:07.124835354 +0000 UTC m=+5458.317353268" watchObservedRunningTime="2026-02-19 22:59:07.132458428 +0000 UTC m=+5458.324976312"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.503628 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p727f"]
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.520759 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.548744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.548972 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.549046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.569360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p727f"]
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651149 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651307 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651792 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.668830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.852094 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p727f"
Feb 19 22:59:08 crc kubenswrapper[4795]: I0219 22:59:08.333969 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p727f"]
Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.105594 4795 generic.go:334] "Generic (PLEG): container finished" podID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerID="2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2" exitCode=0
Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.105679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerDied","Data":"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2"}
Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.105733 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerStarted","Data":"80b6d7fc4851512ae99323cad61b29e71941580cd0bda6065a46f6fe3bd3ed00"}
Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.107928 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.450537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.450586 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 22:59:10 crc kubenswrapper[4795]: I0219 22:59:10.115323 4795 generic.go:334] "Generic (PLEG): container finished" podID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerID="6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019" exitCode=0
Feb 19 22:59:10 crc kubenswrapper[4795]: I0219 22:59:10.115446 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerDied","Data":"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019"} Feb 19 22:59:10 crc kubenswrapper[4795]: I0219 22:59:10.445675 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 22:59:11 crc kubenswrapper[4795]: I0219 22:59:11.126898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerStarted","Data":"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131"} Feb 19 22:59:11 crc kubenswrapper[4795]: I0219 22:59:11.156062 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p727f" podStartSLOduration=2.725196786 podStartE2EDuration="4.156040483s" podCreationTimestamp="2026-02-19 22:59:07 +0000 UTC" firstStartedPulling="2026-02-19 22:59:09.107528365 +0000 UTC m=+5460.300046259" lastFinishedPulling="2026-02-19 22:59:10.538372052 +0000 UTC m=+5461.730889956" observedRunningTime="2026-02-19 22:59:11.148582923 +0000 UTC m=+5462.341100797" watchObservedRunningTime="2026-02-19 22:59:11.156040483 +0000 UTC m=+5462.348558347" Feb 19 22:59:14 crc kubenswrapper[4795]: I0219 22:59:14.450468 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 22:59:14 crc kubenswrapper[4795]: I0219 22:59:14.451282 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.445840 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.474680 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.534282 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.534282 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.601650 4795 scope.go:117] "RemoveContainer" containerID="ccef51520029c7e6d5f912ec1258a3f7218e74073cc8ab1ae05322f167a9a678" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.714635 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.714690 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 22:59:16 crc kubenswrapper[4795]: I0219 22:59:16.204916 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 22:59:16 crc kubenswrapper[4795]: I0219 22:59:16.797608 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:59:16 crc kubenswrapper[4795]: I0219 22:59:16.797693 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:59:17 crc kubenswrapper[4795]: I0219 22:59:17.853013 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:17 crc kubenswrapper[4795]: I0219 22:59:17.853946 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:17 crc kubenswrapper[4795]: I0219 22:59:17.927965 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:18 crc kubenswrapper[4795]: I0219 22:59:18.240271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:18 crc kubenswrapper[4795]: I0219 22:59:18.305803 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p727f"] Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.203140 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p727f" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="registry-server" containerID="cri-o://7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" gracePeriod=2 Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.726877 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.908118 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") pod \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.908361 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") pod \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.908422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") pod \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.909484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities" (OuterVolumeSpecName: "utilities") pod "082b6029-bcb1-409b-b9a4-62f6c1593d5c" (UID: "082b6029-bcb1-409b-b9a4-62f6c1593d5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.917230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9" (OuterVolumeSpecName: "kube-api-access-ztkd9") pod "082b6029-bcb1-409b-b9a4-62f6c1593d5c" (UID: "082b6029-bcb1-409b-b9a4-62f6c1593d5c"). InnerVolumeSpecName "kube-api-access-ztkd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.956666 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "082b6029-bcb1-409b-b9a4-62f6c1593d5c" (UID: "082b6029-bcb1-409b-b9a4-62f6c1593d5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.010211 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.010248 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.010260 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.222674 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.222786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerDied","Data":"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131"} Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.222930 4795 scope.go:117] "RemoveContainer" containerID="7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.222517 4795 generic.go:334] "Generic (PLEG): container finished" podID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerID="7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" exitCode=0 Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.223330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerDied","Data":"80b6d7fc4851512ae99323cad61b29e71941580cd0bda6065a46f6fe3bd3ed00"} Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.244958 4795 scope.go:117] "RemoveContainer" containerID="6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.274423 4795 scope.go:117] "RemoveContainer" containerID="2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.282811 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p727f"] Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.292983 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p727f"] Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.335840 4795 scope.go:117] "RemoveContainer" 
containerID="7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" Feb 19 22:59:21 crc kubenswrapper[4795]: E0219 22:59:21.336436 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131\": container with ID starting with 7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131 not found: ID does not exist" containerID="7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.336505 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131"} err="failed to get container status \"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131\": rpc error: code = NotFound desc = could not find container \"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131\": container with ID starting with 7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131 not found: ID does not exist" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.336533 4795 scope.go:117] "RemoveContainer" containerID="6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019" Feb 19 22:59:21 crc kubenswrapper[4795]: E0219 22:59:21.337133 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019\": container with ID starting with 6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019 not found: ID does not exist" containerID="6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.337185 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019"} err="failed to get container status \"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019\": rpc error: code = NotFound desc = could not find container \"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019\": container with ID starting with 6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019 not found: ID does not exist" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.337218 4795 scope.go:117] "RemoveContainer" containerID="2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2" Feb 19 22:59:21 crc kubenswrapper[4795]: E0219 22:59:21.337526 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2\": container with ID starting with 2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2 not found: ID does not exist" containerID="2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.337561 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2"} err="failed to get container status \"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2\": rpc error: code = NotFound desc = could not find container \"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2\": container with ID starting with 2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2 not found: ID does not exist" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.534806 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" path="/var/lib/kubelet/pods/082b6029-bcb1-409b-b9a4-62f6c1593d5c/volumes" Feb 19 22:59:24 crc kubenswrapper[4795]: I0219 
22:59:24.452687 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 22:59:24 crc kubenswrapper[4795]: I0219 22:59:24.453316 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 22:59:24 crc kubenswrapper[4795]: I0219 22:59:24.455080 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 22:59:24 crc kubenswrapper[4795]: I0219 22:59:24.455870 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 22:59:25 crc kubenswrapper[4795]: I0219 22:59:25.718046 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 22:59:25 crc kubenswrapper[4795]: I0219 22:59:25.718644 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 22:59:25 crc kubenswrapper[4795]: I0219 22:59:25.718888 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 22:59:25 crc kubenswrapper[4795]: I0219 22:59:25.721564 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.266353 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.272520 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.477155 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"] Feb 19 22:59:26 crc kubenswrapper[4795]: E0219 22:59:26.477526 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="extract-utilities" Feb 19 22:59:26 crc 
kubenswrapper[4795]: I0219 22:59:26.477549 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="extract-utilities" Feb 19 22:59:26 crc kubenswrapper[4795]: E0219 22:59:26.477584 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="registry-server" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.477591 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="registry-server" Feb 19 22:59:26 crc kubenswrapper[4795]: E0219 22:59:26.477605 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="extract-content" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.477611 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="extract-content" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.482428 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="registry-server" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.483492 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.499553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"] Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608438 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608474 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711459 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.712517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.712593 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.712835 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.713095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.733882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: 
\"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.804651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:27 crc kubenswrapper[4795]: I0219 22:59:27.268396 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"] Feb 19 22:59:27 crc kubenswrapper[4795]: W0219 22:59:27.273318 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25125096_9221_46cf_9c10_21242922dc39.slice/crio-cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f WatchSource:0}: Error finding container cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f: Status 404 returned error can't find the container with id cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.301310 4795 generic.go:334] "Generic (PLEG): container finished" podID="25125096-9221-46cf-9c10-21242922dc39" containerID="1208585385cb268d9e64c92ccccb01209cc9892bf210d3382d48ae16872a3290" exitCode=0 Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.301613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerDied","Data":"1208585385cb268d9e64c92ccccb01209cc9892bf210d3382d48ae16872a3290"} Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.302061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerStarted","Data":"cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f"} Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.428150 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.428224 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.428267 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.428959 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.429013 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16" gracePeriod=600 Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.312609 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16" exitCode=0 Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.313154 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16"} Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.313195 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"} Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.313210 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.316208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerStarted","Data":"6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2"} Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.316866 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.359133 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bd886b897-t5785" podStartSLOduration=3.358915398 podStartE2EDuration="3.358915398s" podCreationTimestamp="2026-02-19 22:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:29.353318126 +0000 UTC m=+5480.545836000" watchObservedRunningTime="2026-02-19 22:59:29.358915398 +0000 UTC m=+5480.551433262" Feb 19 22:59:36 crc kubenswrapper[4795]: I0219 22:59:36.806382 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bd886b897-t5785" 
Feb 19 22:59:36 crc kubenswrapper[4795]: I0219 22:59:36.878216 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:59:36 crc kubenswrapper[4795]: I0219 22:59:36.878500 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="dnsmasq-dns" containerID="cri-o://b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f" gracePeriod=10 Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.392225 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerID="b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f" exitCode=0 Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.392623 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerDied","Data":"b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f"} Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.392655 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerDied","Data":"ee318881516d5dfefa76dbf8f513dce5d08620986165c943cfdd920651e34509"} Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.392668 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee318881516d5dfefa76dbf8f513dce5d08620986165c943cfdd920651e34509" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.402432 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530398 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530816 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530858 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.543341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w" (OuterVolumeSpecName: "kube-api-access-q8j2w") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "kube-api-access-q8j2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.570999 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config" (OuterVolumeSpecName: "config") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.576142 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.577998 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.584813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632820 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632854 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632865 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632873 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632883 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.401609 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.447436 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.459664 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.699361 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 22:59:38 crc kubenswrapper[4795]: E0219 22:59:38.699787 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="init" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.699804 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="init" Feb 19 22:59:38 crc kubenswrapper[4795]: E0219 22:59:38.699819 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="dnsmasq-dns" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.699827 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="dnsmasq-dns" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.699993 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="dnsmasq-dns" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.700607 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.719702 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.734612 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.735732 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.738234 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.776250 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.856321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.856415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dhb4\" (UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.856456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.856478 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958227 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958413 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dhb4\" 
(UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.959309 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.973820 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dhb4\" (UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.974442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.016292 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.059101 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.387340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.410455 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a84-account-create-update-pl5gt" event={"ID":"de06c33e-b82b-46eb-964b-4bdd02c94166","Type":"ContainerStarted","Data":"994224a55a32aedbb842e33e8f9a0f61c34196b1a61e74254cb494cf650cadb0"} Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.468617 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 22:59:39 crc kubenswrapper[4795]: W0219 22:59:39.469907 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eae15d3_0be7_4510_9803_a7ad3f947148.slice/crio-0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511 WatchSource:0}: Error finding container 0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511: Status 404 returned error can't find the container with id 0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511 Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.522484 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" path="/var/lib/kubelet/pods/2ff79db5-8006-4a3f-bb73-ab6e32d93186/volumes" Feb 19 22:59:40 crc kubenswrapper[4795]: I0219 22:59:40.420159 4795 generic.go:334] "Generic (PLEG): container finished" podID="de06c33e-b82b-46eb-964b-4bdd02c94166" containerID="87dbef2ca275f0dcf2d6fcb445371c808cbb03bd9ce8927982987b0e668dfab9" exitCode=0 Feb 19 22:59:40 crc 
kubenswrapper[4795]: I0219 22:59:40.420256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a84-account-create-update-pl5gt" event={"ID":"de06c33e-b82b-46eb-964b-4bdd02c94166","Type":"ContainerDied","Data":"87dbef2ca275f0dcf2d6fcb445371c808cbb03bd9ce8927982987b0e668dfab9"} Feb 19 22:59:40 crc kubenswrapper[4795]: I0219 22:59:40.422987 4795 generic.go:334] "Generic (PLEG): container finished" podID="6eae15d3-0be7-4510-9803-a7ad3f947148" containerID="dc19c0a9eee505e65f65e0357256fb1d1ef9373c082944f9697154c21d026163" exitCode=0 Feb 19 22:59:40 crc kubenswrapper[4795]: I0219 22:59:40.423075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z6zgv" event={"ID":"6eae15d3-0be7-4510-9803-a7ad3f947148","Type":"ContainerDied","Data":"dc19c0a9eee505e65f65e0357256fb1d1ef9373c082944f9697154c21d026163"} Feb 19 22:59:40 crc kubenswrapper[4795]: I0219 22:59:40.423116 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z6zgv" event={"ID":"6eae15d3-0be7-4510-9803-a7ad3f947148","Type":"ContainerStarted","Data":"0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511"} Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.769361 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.845891 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.907760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") pod \"de06c33e-b82b-46eb-964b-4bdd02c94166\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.908006 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dhb4\" (UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") pod \"de06c33e-b82b-46eb-964b-4bdd02c94166\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.908255 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de06c33e-b82b-46eb-964b-4bdd02c94166" (UID: "de06c33e-b82b-46eb-964b-4bdd02c94166"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.908848 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.913446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4" (OuterVolumeSpecName: "kube-api-access-9dhb4") pod "de06c33e-b82b-46eb-964b-4bdd02c94166" (UID: "de06c33e-b82b-46eb-964b-4bdd02c94166"). InnerVolumeSpecName "kube-api-access-9dhb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.010315 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") pod \"6eae15d3-0be7-4510-9803-a7ad3f947148\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.010401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") pod \"6eae15d3-0be7-4510-9803-a7ad3f947148\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.010945 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dhb4\" (UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.011017 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6eae15d3-0be7-4510-9803-a7ad3f947148" (UID: "6eae15d3-0be7-4510-9803-a7ad3f947148"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.012870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm" (OuterVolumeSpecName: "kube-api-access-bf9mm") pod "6eae15d3-0be7-4510-9803-a7ad3f947148" (UID: "6eae15d3-0be7-4510-9803-a7ad3f947148"). InnerVolumeSpecName "kube-api-access-bf9mm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.112679 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.112713 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.442194 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.442194 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a84-account-create-update-pl5gt" event={"ID":"de06c33e-b82b-46eb-964b-4bdd02c94166","Type":"ContainerDied","Data":"994224a55a32aedbb842e33e8f9a0f61c34196b1a61e74254cb494cf650cadb0"} Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.442310 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="994224a55a32aedbb842e33e8f9a0f61c34196b1a61e74254cb494cf650cadb0" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.448441 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z6zgv" event={"ID":"6eae15d3-0be7-4510-9803-a7ad3f947148","Type":"ContainerDied","Data":"0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511"} Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.448464 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.448500 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.050747 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 22:59:44 crc kubenswrapper[4795]: E0219 22:59:44.054761 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eae15d3-0be7-4510-9803-a7ad3f947148" containerName="mariadb-database-create" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.054784 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eae15d3-0be7-4510-9803-a7ad3f947148" containerName="mariadb-database-create" Feb 19 22:59:44 crc kubenswrapper[4795]: E0219 22:59:44.054832 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de06c33e-b82b-46eb-964b-4bdd02c94166" containerName="mariadb-account-create-update" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.054842 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="de06c33e-b82b-46eb-964b-4bdd02c94166" containerName="mariadb-account-create-update" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.055040 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="de06c33e-b82b-46eb-964b-4bdd02c94166" containerName="mariadb-account-create-update" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.055053 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eae15d3-0be7-4510-9803-a7ad3f947148" containerName="mariadb-database-create" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.055776 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.057631 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.058809 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.059137 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tv4ht" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.059943 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.083597 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.083895 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.084153 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.084347 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.084534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.084766 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.185915 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186335 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186621 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186735 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.191816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") pod \"cinder-db-sync-mn2kc\" (UID: 
\"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.192006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.192237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.192875 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.213252 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.375672 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.862807 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 22:59:44 crc kubenswrapper[4795]: W0219 22:59:44.867228 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51673183_2fe8_4a11_98f0_dec10081e7fc.slice/crio-1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2 WatchSource:0}: Error finding container 1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2: Status 404 returned error can't find the container with id 1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2 Feb 19 22:59:45 crc kubenswrapper[4795]: I0219 22:59:45.493549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mn2kc" event={"ID":"51673183-2fe8-4a11-98f0-dec10081e7fc","Type":"ContainerStarted","Data":"1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2"} Feb 19 22:59:46 crc kubenswrapper[4795]: I0219 22:59:46.504429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mn2kc" event={"ID":"51673183-2fe8-4a11-98f0-dec10081e7fc","Type":"ContainerStarted","Data":"ad3831db9a9f6e12a5ac1eb158cce87c770b2e137cadf64a7ed412c1aedd54c2"} Feb 19 22:59:46 crc kubenswrapper[4795]: I0219 22:59:46.521079 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mn2kc" podStartSLOduration=2.521062846 podStartE2EDuration="2.521062846s" podCreationTimestamp="2026-02-19 22:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:46.519399314 +0000 UTC m=+5497.711917198" watchObservedRunningTime="2026-02-19 22:59:46.521062846 +0000 UTC m=+5497.713580710" Feb 19 22:59:48 crc kubenswrapper[4795]: I0219 22:59:48.520543 
4795 generic.go:334] "Generic (PLEG): container finished" podID="51673183-2fe8-4a11-98f0-dec10081e7fc" containerID="ad3831db9a9f6e12a5ac1eb158cce87c770b2e137cadf64a7ed412c1aedd54c2" exitCode=0 Feb 19 22:59:48 crc kubenswrapper[4795]: I0219 22:59:48.520615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mn2kc" event={"ID":"51673183-2fe8-4a11-98f0-dec10081e7fc","Type":"ContainerDied","Data":"ad3831db9a9f6e12a5ac1eb158cce87c770b2e137cadf64a7ed412c1aedd54c2"} Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.903622 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.986955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987049 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987070 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987201 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") pod 
\"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987292 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987867 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.993443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts" (OuterVolumeSpecName: "scripts") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.996514 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv" (OuterVolumeSpecName: "kube-api-access-rzxlv") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "kube-api-access-rzxlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.996596 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.012826 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.044335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data" (OuterVolumeSpecName: "config-data") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089759 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089790 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089800 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089810 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089819 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.542956 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mn2kc" event={"ID":"51673183-2fe8-4a11-98f0-dec10081e7fc","Type":"ContainerDied","Data":"1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2"} Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.543001 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.543064 4795 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:50 crc kubenswrapper[4795]: E0219 22:59:50.723327 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51673183_2fe8_4a11_98f0_dec10081e7fc.slice\": RecentStats: unable to find data in memory cache]" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.893859 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 22:59:50 crc kubenswrapper[4795]: E0219 22:59:50.894625 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51673183-2fe8-4a11-98f0-dec10081e7fc" containerName="cinder-db-sync" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.894649 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="51673183-2fe8-4a11-98f0-dec10081e7fc" containerName="cinder-db-sync" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.894867 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="51673183-2fe8-4a11-98f0-dec10081e7fc" containerName="cinder-db-sync" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.895995 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.905256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.907154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.907834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.907976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.908143 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: 
\"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010269 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010294 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " 
pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.011264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.011345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.011513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.017225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.019832 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.046003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: 
\"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.130266 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.131723 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.142928 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.143646 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.143924 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.144037 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tv4ht" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.144243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.220821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.220879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 
22:59:51.220930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.220962 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.221072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.221158 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.221201 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.222333 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326105 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326410 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326491 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326509 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326532 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326550 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.327455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.327516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.335947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.336286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.336919 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.346509 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.357731 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.504829 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: W0219 22:59:51.771600 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb20710ae_8abe_4d80_8cdf_582fe785e2cc.slice/crio-676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8 WatchSource:0}: Error finding container 676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8: Status 404 returned error can't find the container with id 676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8 Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.780051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.040123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 22:59:52 crc kubenswrapper[4795]: W0219 22:59:52.051425 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3922ed4f_baf9_481e_af8b_b009440dfea2.slice/crio-595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb WatchSource:0}: Error finding container 595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb: Status 404 returned error can't find the container with id 595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.564685 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerStarted","Data":"595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb"} Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.566855 4795 generic.go:334] "Generic (PLEG): container finished" podID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerID="b799c6817a13193660264c0e1a6b6bbda173865957b6b9f053b5f17c7df00f19" 
exitCode=0
Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.566950 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerDied","Data":"b799c6817a13193660264c0e1a6b6bbda173865957b6b9f053b5f17c7df00f19"}
Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.566968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerStarted","Data":"676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8"}
Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.577913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerStarted","Data":"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7"}
Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.578372 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.578385 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerStarted","Data":"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930"}
Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.580286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerStarted","Data":"af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46"}
Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.580886 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w"
Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.597368 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.597347517 podStartE2EDuration="2.597347517s" podCreationTimestamp="2026-02-19 22:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:53.593339034 +0000 UTC m=+5504.785856908" watchObservedRunningTime="2026-02-19 22:59:53.597347517 +0000 UTC m=+5504.789865401"
Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.622312 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" podStartSLOduration=3.622294061 podStartE2EDuration="3.622294061s" podCreationTimestamp="2026-02-19 22:59:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:53.616708789 +0000 UTC m=+5504.809226643" watchObservedRunningTime="2026-02-19 22:59:53.622294061 +0000 UTC m=+5504.814811925"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.134889 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"]
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.136820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.144974 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.144974 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.151039 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"]
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.241730 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.241913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.242000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.343444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.343806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.343988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.344526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.353885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.358179 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.458908 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.921523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"]
Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.231309 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w"
Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.303653 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"]
Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.303905 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bd886b897-t5785" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns" containerID="cri-o://6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2" gracePeriod=10
Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.683637 4795 generic.go:334] "Generic (PLEG): container finished" podID="25125096-9221-46cf-9c10-21242922dc39" containerID="6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2" exitCode=0
Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.683718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerDied","Data":"6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2"}
Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.708953 4795 generic.go:334] "Generic (PLEG): container finished" podID="40381732-f007-4395-b8d1-02b3fc37b091" containerID="f0a7532029d7c277b9d534f1fee42695eb4906e52f264e18f9db0ba49bbbdeb7" exitCode=0
Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.709268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" event={"ID":"40381732-f007-4395-b8d1-02b3fc37b091","Type":"ContainerDied","Data":"f0a7532029d7c277b9d534f1fee42695eb4906e52f264e18f9db0ba49bbbdeb7"}
Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.709293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" event={"ID":"40381732-f007-4395-b8d1-02b3fc37b091","Type":"ContainerStarted","Data":"8765f9a21122d91b62195625103e3e18985cebc537953bba019e930ab4a19f4c"}
Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.919215 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd886b897-t5785"
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074695 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") "
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") "
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074876 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") "
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074919 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") "
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") "
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.098315 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw" (OuterVolumeSpecName: "kube-api-access-jmqvw") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "kube-api-access-jmqvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.120593 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.125730 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config" (OuterVolumeSpecName: "config") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.135665 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.144597 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179599 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179627 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179637 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179647 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179656 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.720736 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd886b897-t5785"
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.721078 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerDied","Data":"cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f"}
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.721209 4795 scope.go:117] "RemoveContainer" containerID="6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2"
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.759883 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"]
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.770665 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"]
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.772509 4795 scope.go:117] "RemoveContainer" containerID="1208585385cb268d9e64c92ccccb01209cc9892bf210d3382d48ae16872a3290"
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.897444 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.897921 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerName="nova-scheduler-scheduler" containerID="cri-o://44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5" gracePeriod=30
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.935542 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.935917 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" containerName="nova-cell0-conductor-conductor" containerID="cri-o://23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3" gracePeriod=30
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.950993 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.951262 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" containerID="cri-o://3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e" gracePeriod=30
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.951659 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" containerID="cri-o://a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6" gracePeriod=30
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.961225 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.961604 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" containerID="cri-o://9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" gracePeriod=30
Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.962076 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" containerID="cri-o://6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" gracePeriod=30
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:02.997398 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.004212 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87" gracePeriod=30
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.062450 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.062628 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314" gracePeriod=30
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.170206 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.310234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") pod \"40381732-f007-4395-b8d1-02b3fc37b091\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") "
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.310333 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") pod \"40381732-f007-4395-b8d1-02b3fc37b091\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") "
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.310412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") pod \"40381732-f007-4395-b8d1-02b3fc37b091\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") "
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.311536 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume" (OuterVolumeSpecName: "config-volume") pod "40381732-f007-4395-b8d1-02b3fc37b091" (UID: "40381732-f007-4395-b8d1-02b3fc37b091"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.319475 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj" (OuterVolumeSpecName: "kube-api-access-fqwqj") pod "40381732-f007-4395-b8d1-02b3fc37b091" (UID: "40381732-f007-4395-b8d1-02b3fc37b091"). InnerVolumeSpecName "kube-api-access-fqwqj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.327440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40381732-f007-4395-b8d1-02b3fc37b091" (UID: "40381732-f007-4395-b8d1-02b3fc37b091"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.412400 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.412440 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.412454 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.544118 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25125096-9221-46cf-9c10-21242922dc39" path="/var/lib/kubelet/pods/25125096-9221-46cf-9c10-21242922dc39/volumes"
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.738382 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.739453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" event={"ID":"40381732-f007-4395-b8d1-02b3fc37b091","Type":"ContainerDied","Data":"8765f9a21122d91b62195625103e3e18985cebc537953bba019e930ab4a19f4c"}
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.739508 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8765f9a21122d91b62195625103e3e18985cebc537953bba019e930ab4a19f4c"
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.747730 4795 generic.go:334] "Generic (PLEG): container finished" podID="30b4399f-d971-4131-8243-3c45e8353cdd" containerID="9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" exitCode=143
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.747832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerDied","Data":"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff"}
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.750547 4795 generic.go:334] "Generic (PLEG): container finished" podID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerID="3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e" exitCode=143
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.750604 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerDied","Data":"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e"}
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.751848 4795 generic.go:334] "Generic (PLEG): container finished" podID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" containerID="4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87" exitCode=0
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.751874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09","Type":"ContainerDied","Data":"4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87"}
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.751892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09","Type":"ContainerDied","Data":"3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43"}
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.751905 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43"
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.766300 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.792639 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.925012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") pod \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") "
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.925086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") pod \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") "
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.925247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") pod \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") "
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.942009 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp" (OuterVolumeSpecName: "kube-api-access-f7jkp") pod "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" (UID: "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09"). InnerVolumeSpecName "kube-api-access-f7jkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.962570 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" (UID: "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.981181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data" (OuterVolumeSpecName: "config-data") pod "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" (UID: "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.028627 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.028748 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.028817 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.319235 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"]
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.336517 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"]
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.348621 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.439755 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") pod \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") "
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.439971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") pod \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") "
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.440008 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") pod \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") "
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.450385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x" (OuterVolumeSpecName: "kube-api-access-7gp8x") pod "f7e72169-9bdf-4b8f-9f8d-ca2a994287da" (UID: "f7e72169-9bdf-4b8f-9f8d-ca2a994287da"). InnerVolumeSpecName "kube-api-access-7gp8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.466832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data" (OuterVolumeSpecName: "config-data") pod "f7e72169-9bdf-4b8f-9f8d-ca2a994287da" (UID: "f7e72169-9bdf-4b8f-9f8d-ca2a994287da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.484907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7e72169-9bdf-4b8f-9f8d-ca2a994287da" (UID: "f7e72169-9bdf-4b8f-9f8d-ca2a994287da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.542603 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.542642 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.542656 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.761964 4795 generic.go:334] "Generic (PLEG): container finished" podID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerID="44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5" exitCode=0
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7e72169-9bdf-4b8f-9f8d-ca2a994287da","Type":"ContainerDied","Data":"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5"}
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7e72169-9bdf-4b8f-9f8d-ca2a994287da","Type":"ContainerDied","Data":"914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747"}
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762061 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762074 4795 scope.go:117] "RemoveContainer" containerID="44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762070 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.793901 4795 scope.go:117] "RemoveContainer" containerID="44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5"
Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.794997 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5\": container with ID starting with 44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5 not found: ID does not exist" containerID="44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.795117 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5"} err="failed to get container status \"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5\": rpc error: code = NotFound desc = could not find container \"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5\": container with ID starting with 44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5 not found: ID does not exist"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.818676 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.847093 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.864740 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.902295 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.914305 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.915077 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40381732-f007-4395-b8d1-02b3fc37b091" containerName="collect-profiles"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.915098 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40381732-f007-4395-b8d1-02b3fc37b091" containerName="collect-profiles"
Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.915109 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerName="nova-scheduler-scheduler"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.915115 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerName="nova-scheduler-scheduler"
Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.915130 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.915138 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns"
Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.915151 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="init"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.915157 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="init"
Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.918349 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918378 4795 state_mem.go:107] "Deleted CPUSet assignment"
podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918780 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="40381732-f007-4395-b8d1-02b3fc37b091" containerName="collect-profiles" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918922 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerName="nova-scheduler-scheduler" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918951 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.919759 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.922003 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.935565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.948979 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.950152 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.952445 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.956528 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwh68\" (UniqueName: \"kubernetes.io/projected/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-kube-api-access-xwh68\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 
23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052309 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153588 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwh68\" (UniqueName: \"kubernetes.io/projected/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-kube-api-access-xwh68\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153674 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153704 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153724 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.159682 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.161363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.164775 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.169592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.170447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwh68\" (UniqueName: \"kubernetes.io/projected/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-kube-api-access-xwh68\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.181714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.259601 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.325859 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.333521 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.358470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") pod \"cb0fc8ea-d689-468f-a612-b87c3e63077d\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.358619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") pod \"cb0fc8ea-d689-468f-a612-b87c3e63077d\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.358654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") pod \"cb0fc8ea-d689-468f-a612-b87c3e63077d\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.369039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5" (OuterVolumeSpecName: "kube-api-access-w9np5") pod "cb0fc8ea-d689-468f-a612-b87c3e63077d" (UID: "cb0fc8ea-d689-468f-a612-b87c3e63077d"). InnerVolumeSpecName "kube-api-access-w9np5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.391592 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data" (OuterVolumeSpecName: "config-data") pod "cb0fc8ea-d689-468f-a612-b87c3e63077d" (UID: "cb0fc8ea-d689-468f-a612-b87c3e63077d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.405402 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb0fc8ea-d689-468f-a612-b87c3e63077d" (UID: "cb0fc8ea-d689-468f-a612-b87c3e63077d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.461404 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.461431 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.461442 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.526818 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d0ffce6-6c23-4d04-a029-6322d065ff24" path="/var/lib/kubelet/pods/1d0ffce6-6c23-4d04-a029-6322d065ff24/volumes" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.527582 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" path="/var/lib/kubelet/pods/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09/volumes" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.528095 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" 
path="/var/lib/kubelet/pods/f7e72169-9bdf-4b8f-9f8d-ca2a994287da/volumes" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793077 4795 generic.go:334] "Generic (PLEG): container finished" podID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerID="59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314" exitCode=0 Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cb0fc8ea-d689-468f-a612-b87c3e63077d","Type":"ContainerDied","Data":"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314"} Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cb0fc8ea-d689-468f-a612-b87c3e63077d","Type":"ContainerDied","Data":"0e3ce7752bf63997ed6557e690f1dee58f948c11df612377dc82c2d8fde569b2"} Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793670 4795 scope.go:117] "RemoveContainer" containerID="59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793829 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.818652 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.826458 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.844848 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:00:05 crc kubenswrapper[4795]: E0219 23:00:05.845850 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerName="nova-cell1-conductor-conductor" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.852262 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerName="nova-cell1-conductor-conductor" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.852674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerName="nova-cell1-conductor-conductor" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.853200 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.853316 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.836487 4795 scope.go:117] "RemoveContainer" containerID="59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314" Feb 19 23:00:05 crc kubenswrapper[4795]: E0219 23:00:05.854442 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314\": container with ID starting with 59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314 not found: ID does not exist" containerID="59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.854469 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314"} err="failed to get container status \"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314\": rpc error: code = NotFound desc = could not find container \"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314\": container with ID starting with 59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314 not found: ID does not exist" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.858953 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.861330 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.927563 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.976859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnl2g\" (UniqueName: 
\"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.976924 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.976977 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.078453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.078639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnl2g\" (UniqueName: \"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.078708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.086205 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.088790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.095343 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnl2g\" (UniqueName: \"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.117153 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": read tcp 10.217.0.2:35782->10.217.1.69:8775: read: connection reset by peer" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.117250 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": read 
tcp 10.217.0.2:35772->10.217.1.69:8775: read: connection reset by peer" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.126598 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": read tcp 10.217.0.2:47930->10.217.1.71:8774: read: connection reset by peer" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.126675 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": read tcp 10.217.0.2:47942->10.217.1.71:8774: read: connection reset by peer" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.318140 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.594971 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.678218 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.687673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") pod \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.687727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") pod \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.687763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") pod \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.687842 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") pod \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.689673 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs" (OuterVolumeSpecName: "logs") pod "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" (UID: "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.695282 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7" (OuterVolumeSpecName: "kube-api-access-4khb7") pod "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" (UID: "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a"). InnerVolumeSpecName "kube-api-access-4khb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.721514 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" (UID: "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.757667 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data" (OuterVolumeSpecName: "config-data") pod "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" (UID: "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.789044 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") pod \"30b4399f-d971-4131-8243-3c45e8353cdd\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.789433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") pod \"30b4399f-d971-4131-8243-3c45e8353cdd\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.789597 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") pod \"30b4399f-d971-4131-8243-3c45e8353cdd\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.789742 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") pod \"30b4399f-d971-4131-8243-3c45e8353cdd\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.790299 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.792203 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") on node \"crc\" DevicePath \"\"" Feb 19 
23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.794158 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.794309 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.793273 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs" (OuterVolumeSpecName: "logs") pod "30b4399f-d971-4131-8243-3c45e8353cdd" (UID: "30b4399f-d971-4131-8243-3c45e8353cdd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.796294 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6" (OuterVolumeSpecName: "kube-api-access-qnff6") pod "30b4399f-d971-4131-8243-3c45e8353cdd" (UID: "30b4399f-d971-4131-8243-3c45e8353cdd"). InnerVolumeSpecName "kube-api-access-qnff6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.809289 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bd886b897-t5785" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.73:5353: i/o timeout" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.820490 4795 generic.go:334] "Generic (PLEG): container finished" podID="9839fd0b-0161-4772-bda3-ddc2914d7e83" containerID="23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3" exitCode=0 Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.820543 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9839fd0b-0161-4772-bda3-ddc2914d7e83","Type":"ContainerDied","Data":"23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3"} Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.821752 4795 generic.go:334] "Generic (PLEG): container finished" podID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerID="a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6" exitCode=0 Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.821789 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerDied","Data":"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6"} Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.821806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerDied","Data":"6c69b56affcd6e3319b891be2d4392924607b5fdbd45a93af3cf16129ab8f1e4"} Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.821821 4795 scope.go:117] "RemoveContainer" containerID="a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6" Feb 19 23:00:06 crc kubenswrapper[4795]: 
I0219 23:00:06.821932 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.822421 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data" (OuterVolumeSpecName: "config-data") pod "30b4399f-d971-4131-8243-3c45e8353cdd" (UID: "30b4399f-d971-4131-8243-3c45e8353cdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.829989 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2eb28a2e-eb12-4867-9c26-3416349cc1cc","Type":"ContainerStarted","Data":"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2"} Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.830027 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2eb28a2e-eb12-4867-9c26-3416349cc1cc","Type":"ContainerStarted","Data":"93528f93304bf625c4781fad623cd7f9e1b6953a05d729d20b96627d846cf536"} Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.830870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30b4399f-d971-4131-8243-3c45e8353cdd" (UID: "30b4399f-d971-4131-8243-3c45e8353cdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.832769 4795 generic.go:334] "Generic (PLEG): container finished" podID="30b4399f-d971-4131-8243-3c45e8353cdd" containerID="6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" exitCode=0 Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.833118 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.833129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerDied","Data":"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f"} Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.833413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerDied","Data":"d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf"} Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.834596 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a96a8189-2b04-4ce7-908b-3544dc3b7ec4","Type":"ContainerStarted","Data":"4082a6cff09d82d8edbb880cbdfe3a90c43570e40c46f1c1fef04e5446058702"} Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.834634 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a96a8189-2b04-4ce7-908b-3544dc3b7ec4","Type":"ContainerStarted","Data":"6c4d2b266d4e96be58f9a61951fa9e543bbdb3cc7c893afe0899c3de4f75ba7d"} Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.843617 4795 scope.go:117] "RemoveContainer" containerID="3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.855841 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.855820261 podStartE2EDuration="2.855820261s" podCreationTimestamp="2026-02-19 23:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:06.851595334 +0000 UTC m=+5518.044113218" watchObservedRunningTime="2026-02-19 23:00:06.855820261 +0000 UTC m=+5518.048338125" 
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.861479 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.877913 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.8778956730000003 podStartE2EDuration="2.877895673s" podCreationTimestamp="2026-02-19 23:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:06.87738212 +0000 UTC m=+5518.069899984" watchObservedRunningTime="2026-02-19 23:00:06.877895673 +0000 UTC m=+5518.070413537" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.896908 4795 scope.go:117] "RemoveContainer" containerID="a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.897997 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.898034 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.898046 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.898058 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") on node \"crc\" DevicePath \"\"" Feb 
19 23:00:06 crc kubenswrapper[4795]: E0219 23:00:06.906786 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6\": container with ID starting with a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6 not found: ID does not exist" containerID="a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.906821 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6"} err="failed to get container status \"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6\": rpc error: code = NotFound desc = could not find container \"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6\": container with ID starting with a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6 not found: ID does not exist" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.906845 4795 scope.go:117] "RemoveContainer" containerID="3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e" Feb 19 23:00:06 crc kubenswrapper[4795]: E0219 23:00:06.914312 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e\": container with ID starting with 3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e not found: ID does not exist" containerID="3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.914367 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e"} err="failed to get container status 
\"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e\": rpc error: code = NotFound desc = could not find container \"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e\": container with ID starting with 3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e not found: ID does not exist" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.914395 4795 scope.go:117] "RemoveContainer" containerID="6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.922300 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.969082 4795 scope.go:117] "RemoveContainer" containerID="9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.978101 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.999250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") pod \"9839fd0b-0161-4772-bda3-ddc2914d7e83\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:06.999340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l54h5\" (UniqueName: \"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") pod \"9839fd0b-0161-4772-bda3-ddc2914d7e83\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.004546 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") pod 
\"9839fd0b-0161-4772-bda3-ddc2914d7e83\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.008907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5" (OuterVolumeSpecName: "kube-api-access-l54h5") pod "9839fd0b-0161-4772-bda3-ddc2914d7e83" (UID: "9839fd0b-0161-4772-bda3-ddc2914d7e83"). InnerVolumeSpecName "kube-api-access-l54h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018283 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018669 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" containerName="nova-cell0-conductor-conductor" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018681 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" containerName="nova-cell0-conductor-conductor" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018698 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018706 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018738 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018754 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018761 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018779 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018785 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018950 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" containerName="nova-cell0-conductor-conductor" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018960 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018975 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018983 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.020551 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.023762 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.033409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9839fd0b-0161-4772-bda3-ddc2914d7e83" (UID: "9839fd0b-0161-4772-bda3-ddc2914d7e83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.036353 4795 scope.go:117] "RemoveContainer" containerID="6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.036933 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f\": container with ID starting with 6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f not found: ID does not exist" containerID="6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.036959 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f"} err="failed to get container status \"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f\": rpc error: code = NotFound desc = could not find container \"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f\": container with ID starting with 6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f not found: ID does not exist" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.036977 4795 scope.go:117] 
"RemoveContainer" containerID="9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.037443 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff\": container with ID starting with 9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff not found: ID does not exist" containerID="9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.037466 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff"} err="failed to get container status \"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff\": rpc error: code = NotFound desc = could not find container \"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff\": container with ID starting with 9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff not found: ID does not exist" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.050319 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.060395 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data" (OuterVolumeSpecName: "config-data") pod "9839fd0b-0161-4772-bda3-ddc2914d7e83" (UID: "9839fd0b-0161-4772-bda3-ddc2914d7e83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.067765 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.074777 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.085873 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.087701 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.090206 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.102389 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106320 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106365 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106449 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106557 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l54h5\" (UniqueName: \"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106572 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106581 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.141324 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212143 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212223 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " 
pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2xch\" (UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212822 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.218900 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.232865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.255480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2xch\" 
(UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.324068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.338654 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.346667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2xch\" (UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.361613 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.409619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.547207 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" path="/var/lib/kubelet/pods/30b4399f-d971-4131-8243-3c45e8353cdd/volumes" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.547925 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" path="/var/lib/kubelet/pods/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a/volumes" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.548467 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" path="/var/lib/kubelet/pods/cb0fc8ea-d689-468f-a612-b87c3e63077d/volumes" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.844348 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"f1a9135a-42ef-42ca-880a-f4f5ffd78a13","Type":"ContainerStarted","Data":"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1"} Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.844585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f1a9135a-42ef-42ca-880a-f4f5ffd78a13","Type":"ContainerStarted","Data":"2adc08c6dd489704d7eddd8052ac3149a11c14f7992162c2dccd22cfce6e5fe5"} Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.846571 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.850824 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9839fd0b-0161-4772-bda3-ddc2914d7e83","Type":"ContainerDied","Data":"453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8"} Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.850873 4795 scope.go:117] "RemoveContainer" containerID="23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.850965 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.870967 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.870944368 podStartE2EDuration="2.870944368s" podCreationTimestamp="2026-02-19 23:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:07.865842748 +0000 UTC m=+5519.058360612" watchObservedRunningTime="2026-02-19 23:00:07.870944368 +0000 UTC m=+5519.063462232" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.899980 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.916655 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.007079 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.021141 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.022642 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.028551 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.031801 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.043258 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.160392 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mzqk\" (UniqueName: \"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.160785 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.160884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.264300 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mzqk\" (UniqueName: 
\"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.264381 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.264476 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.273134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.277252 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.286597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mzqk\" (UniqueName: \"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") pod \"nova-cell0-conductor-0\" (UID: 
\"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.377400 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.819198 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.867936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerStarted","Data":"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.867987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerStarted","Data":"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.868001 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerStarted","Data":"df7150bd3c379f19d4f935bb8e348093119703bff34dd1ad6781416721057a60"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.874120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerStarted","Data":"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.874150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerStarted","Data":"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.874182 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerStarted","Data":"7ee07f55b63d65c8ea13f8ca8377dd262c2422aff92a6a2abfe47d6fef72c015"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.879070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"761a7217-33fa-4d78-8a05-492cbb33f48d","Type":"ContainerStarted","Data":"a920c7c53728d52d3ab518fdecf0b6800cb795ab36fabc65145920815940fa68"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.895558 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.895539045 podStartE2EDuration="2.895539045s" podCreationTimestamp="2026-02-19 23:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:08.885097209 +0000 UTC m=+5520.077615073" watchObservedRunningTime="2026-02-19 23:00:08.895539045 +0000 UTC m=+5520.088056919" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.915522 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.915493983 podStartE2EDuration="2.915493983s" podCreationTimestamp="2026-02-19 23:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:08.90161435 +0000 UTC m=+5520.094132214" watchObservedRunningTime="2026-02-19 23:00:08.915493983 +0000 UTC m=+5520.108011877" Feb 19 23:00:09 crc kubenswrapper[4795]: I0219 23:00:09.529614 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" path="/var/lib/kubelet/pods/9839fd0b-0161-4772-bda3-ddc2914d7e83/volumes" Feb 19 23:00:09 crc kubenswrapper[4795]: I0219 23:00:09.894793 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"761a7217-33fa-4d78-8a05-492cbb33f48d","Type":"ContainerStarted","Data":"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5"} Feb 19 23:00:09 crc kubenswrapper[4795]: I0219 23:00:09.895827 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:09 crc kubenswrapper[4795]: I0219 23:00:09.920399 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.9203789479999998 podStartE2EDuration="2.920378948s" podCreationTimestamp="2026-02-19 23:00:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:09.913486022 +0000 UTC m=+5521.106003896" watchObservedRunningTime="2026-02-19 23:00:09.920378948 +0000 UTC m=+5521.112896812" Feb 19 23:00:10 crc kubenswrapper[4795]: I0219 23:00:10.327029 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 23:00:10 crc kubenswrapper[4795]: I0219 23:00:10.333973 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:12 crc kubenswrapper[4795]: I0219 23:00:12.362214 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:00:12 crc kubenswrapper[4795]: I0219 23:00:12.362623 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.327673 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.334655 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:15 crc 
kubenswrapper[4795]: I0219 23:00:15.345580 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.355339 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.823704 4795 scope.go:117] "RemoveContainer" containerID="e5dc57de5d860d1b9f4b0c7fa5487f6cf98d60033e436fbbba7b629cc254b689" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.993709 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:16 crc kubenswrapper[4795]: I0219 23:00:16.045477 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 23:00:16 crc kubenswrapper[4795]: I0219 23:00:16.346773 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:17 crc kubenswrapper[4795]: I0219 23:00:17.362423 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:00:17 crc kubenswrapper[4795]: I0219 23:00:17.362485 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:00:17 crc kubenswrapper[4795]: I0219 23:00:17.410308 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:00:17 crc kubenswrapper[4795]: I0219 23:00:17.410363 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.407823 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.446498 4795 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.446686 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.534499 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.84:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.534513 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.84:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.595775 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.598411 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.600929 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.611339 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750402 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750511 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750675 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq4nx\" (UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: 
I0219 23:00:21.750726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852879 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq4nx\" 
(UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852953 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.853506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.858567 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.859548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " 
pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.859727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.861508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.871905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq4nx\" (UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.967858 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:00:22 crc kubenswrapper[4795]: I0219 23:00:22.475644 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:23 crc kubenswrapper[4795]: I0219 23:00:23.071295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerStarted","Data":"92b489e7d1f8c23a12cd2c3b0232a07d52f20da60c5e7bd656a106e1bbfc254b"} Feb 19 23:00:23 crc kubenswrapper[4795]: I0219 23:00:23.384185 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:23 crc kubenswrapper[4795]: I0219 23:00:23.385116 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api-log" containerID="cri-o://aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" gracePeriod=30 Feb 19 23:00:23 crc kubenswrapper[4795]: I0219 23:00:23.385264 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" containerID="cri-o://776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" gracePeriod=30 Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.081998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerStarted","Data":"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9"} Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.082347 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerStarted","Data":"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe"} Feb 19 23:00:24 crc 
kubenswrapper[4795]: I0219 23:00:24.085320 4795 generic.go:334] "Generic (PLEG): container finished" podID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerID="aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" exitCode=143 Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.085355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerDied","Data":"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930"} Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.100524 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.102426 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.108275 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.125702 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.128724 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.128696538 podStartE2EDuration="3.128696538s" podCreationTimestamp="2026-02-19 23:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:24.108795251 +0000 UTC m=+5535.301313115" watchObservedRunningTime="2026-02-19 23:00:24.128696538 +0000 UTC m=+5535.321214402" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195661 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195712 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195770 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-dev\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196093 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-sys\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xj8\" (UniqueName: 
\"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-kube-api-access-r8xj8\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196632 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196715 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-run\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-run\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-scripts\") pod \"cinder-volume-volume1-0\" (UID: 
\"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299681 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299703 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-dev\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 
23:00:24.299820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-sys\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xj8\" (UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-kube-api-access-r8xj8\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299984 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.300048 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.301671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-sys\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.301739 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-dev\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302042 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-run\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302629 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 
23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302643 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302695 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.307348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.307727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.307962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.309264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.314010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.322024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xj8\" (UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-kube-api-access-r8xj8\") pod \"cinder-volume-volume1-0\" (UID: 
\"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.424925 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.014875 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.017022 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.020994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.031746 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.112380 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125192 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-scripts\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125253 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125268 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data-custom\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-sys\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-nvme\") pod \"cinder-backup-0\" (UID: 
\"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-ceph\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-lib-modules\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125443 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-dev\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 
23:00:25.125504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-run\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-457zc\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-kube-api-access-457zc\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125544 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-scripts\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227689 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227706 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data-custom\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-sys\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: 
I0219 23:00:25.227840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-nvme\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-ceph\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-lib-modules\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-dev\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-combined-ca-bundle\") pod 
\"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-run\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-457zc\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-kube-api-access-457zc\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228656 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-dev\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-lib-modules\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229081 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229449 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-sys\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-run\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 
23:00:25.229572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-nvme\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.236001 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.236377 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-scripts\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.236715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.237213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-ceph\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.239611 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data-custom\") pod \"cinder-backup-0\" (UID: 
\"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.263188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457zc\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-kube-api-access-457zc\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.385704 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.021215 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.023966 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: W0219 23:00:26.041951 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12de80a7_e42b_4768_83d4_0ed7d7490c30.slice/crio-881e7c70e362f07106c7210992c15ba1c9599247aeb6ae56884f8e6be5569570 WatchSource:0}: Error finding container 881e7c70e362f07106c7210992c15ba1c9599247aeb6ae56884f8e6be5569570: Status 404 returned error can't find the container with id 881e7c70e362f07106c7210992c15ba1c9599247aeb6ae56884f8e6be5569570 Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.060006 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.106207 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.116265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"90e22321-4464-4199-b873-8998821a02ed","Type":"ContainerStarted","Data":"ea7efe1c9e8ab00bffd97273e2e98eee7a5bf08616ffdbda291a2ddac368b77f"} Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.117509 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"12de80a7-e42b-4768-83d4-0ed7d7490c30","Type":"ContainerStarted","Data":"881e7c70e362f07106c7210992c15ba1c9599247aeb6ae56884f8e6be5569570"} Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.147307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.147362 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.147919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.249811 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") pod \"redhat-operators-b9kk9\" (UID: 
\"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.250411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.250508 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.250428 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.250979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.283541 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " 
pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.345657 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.565814 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.78:8776/healthcheck\": read tcp 10.217.0.2:40670->10.217.1.78:8776: read: connection reset by peer" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.875459 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.968817 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.066760 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.163572 4795 generic.go:334] "Generic (PLEG): container finished" podID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerID="776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" exitCode=0 Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.163639 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.163665 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerDied","Data":"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.165934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerDied","Data":"595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.166042 4795 scope.go:117] "RemoveContainer" containerID="776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.172755 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerStarted","Data":"09bb4d42669be15e952593140fb7fd9a24d0d3ef4f5071cd4b5b653e574c2307"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.181407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"90e22321-4464-4199-b873-8998821a02ed","Type":"ContainerStarted","Data":"457bbb15ecd6c0b5e367713e9b9154013c2947fef4fac9999aba9df45c607652"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.181456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"90e22321-4464-4199-b873-8998821a02ed","Type":"ContainerStarted","Data":"99be7e590f387dd0d638a8fcea28d04ea3256b782be62a9a947c6792d92b775d"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182859 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182911 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182978 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.183039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.183058 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.183457 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.188424 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs" (OuterVolumeSpecName: "logs") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.204305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.208440 4795 scope.go:117] "RemoveContainer" containerID="aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.216845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7" (OuterVolumeSpecName: "kube-api-access-thkh7") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "kube-api-access-thkh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.217649 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts" (OuterVolumeSpecName: "scripts") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.229386 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.496400299 podStartE2EDuration="3.229355514s" podCreationTimestamp="2026-02-19 23:00:24 +0000 UTC" firstStartedPulling="2026-02-19 23:00:25.121030504 +0000 UTC m=+5536.313548368" lastFinishedPulling="2026-02-19 23:00:25.853985719 +0000 UTC m=+5537.046503583" observedRunningTime="2026-02-19 23:00:27.216557799 +0000 UTC m=+5538.409075683" watchObservedRunningTime="2026-02-19 23:00:27.229355514 +0000 UTC m=+5538.421873368" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.242139 4795 scope.go:117] "RemoveContainer" containerID="776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" Feb 19 23:00:27 crc kubenswrapper[4795]: E0219 23:00:27.243879 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7\": container with ID starting with 776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7 not found: ID does not exist" containerID="776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.243922 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7"} err="failed to get container status \"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7\": rpc error: code = NotFound desc = could not find container \"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7\": container with ID starting with 776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7 not found: ID does not exist" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.243951 4795 scope.go:117] 
"RemoveContainer" containerID="aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" Feb 19 23:00:27 crc kubenswrapper[4795]: E0219 23:00:27.244646 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930\": container with ID starting with aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930 not found: ID does not exist" containerID="aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.244672 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930"} err="failed to get container status \"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930\": rpc error: code = NotFound desc = could not find container \"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930\": container with ID starting with aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930 not found: ID does not exist" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.287777 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.289493 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.289516 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc 
kubenswrapper[4795]: I0219 23:00:27.289530 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.289545 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.288548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.340858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data" (OuterVolumeSpecName: "config-data") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.368704 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.371942 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.386747 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.392700 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.392732 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.426536 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.427549 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.432996 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.439510 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.580345 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.611607 4795 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.627574 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:27 crc kubenswrapper[4795]: E0219 23:00:27.628000 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api-log" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.628013 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api-log" Feb 19 23:00:27 crc kubenswrapper[4795]: E0219 23:00:27.628027 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.628033 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.628272 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.628290 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api-log" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.629261 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.636891 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.639933 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzgqt\" (UniqueName: \"kubernetes.io/projected/daeb9555-6d76-45ca-b3da-b6dd91c33e00-kube-api-access-dzgqt\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/daeb9555-6d76-45ca-b3da-b6dd91c33e00-etc-machine-id\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daeb9555-6d76-45ca-b3da-b6dd91c33e00-logs\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711470 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data-custom\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711603 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-scripts\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-scripts\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813343 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzgqt\" (UniqueName: \"kubernetes.io/projected/daeb9555-6d76-45ca-b3da-b6dd91c33e00-kube-api-access-dzgqt\") pod 
\"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/daeb9555-6d76-45ca-b3da-b6dd91c33e00-etc-machine-id\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daeb9555-6d76-45ca-b3da-b6dd91c33e00-logs\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data-custom\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813927 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/daeb9555-6d76-45ca-b3da-b6dd91c33e00-etc-machine-id\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.814195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/daeb9555-6d76-45ca-b3da-b6dd91c33e00-logs\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.819737 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-scripts\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.824755 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data-custom\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.826039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.826846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.852277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzgqt\" (UniqueName: \"kubernetes.io/projected/daeb9555-6d76-45ca-b3da-b6dd91c33e00-kube-api-access-dzgqt\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.968398 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.222210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"12de80a7-e42b-4768-83d4-0ed7d7490c30","Type":"ContainerStarted","Data":"1d6f3a131c3730f147c75e6cf092cfe1b2fef75915a932a4a830767b2806bbb3"} Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.222599 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"12de80a7-e42b-4768-83d4-0ed7d7490c30","Type":"ContainerStarted","Data":"539f48c9031cb8e3d4c10faf0cffcb2a7d4258bbee19b5ff93167b6f1554ee43"} Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.232001 4795 generic.go:334] "Generic (PLEG): container finished" podID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerID="3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4" exitCode=0 Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.233019 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerDied","Data":"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4"} Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.235702 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.258494 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.260801 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.272710 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.29915732 
podStartE2EDuration="4.272693918s" podCreationTimestamp="2026-02-19 23:00:24 +0000 UTC" firstStartedPulling="2026-02-19 23:00:26.049065774 +0000 UTC m=+5537.241583638" lastFinishedPulling="2026-02-19 23:00:27.022602372 +0000 UTC m=+5538.215120236" observedRunningTime="2026-02-19 23:00:28.257558743 +0000 UTC m=+5539.450076607" watchObservedRunningTime="2026-02-19 23:00:28.272693918 +0000 UTC m=+5539.465211782" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.529305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.809764 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"] Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.815242 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.818523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"] Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.950656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.950779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.950832 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.053250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.053378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.053429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.054445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 
23:00:29.054743 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.077060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.147474 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.277200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"daeb9555-6d76-45ca-b3da-b6dd91c33e00","Type":"ContainerStarted","Data":"e8e06b8be2864d518c9bf61d6a21ee38c0b47470da224147a202924170bba205"} Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.426620 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.546365 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" path="/var/lib/kubelet/pods/3922ed4f-baf9-481e-af8b-b009440dfea2/volumes" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.777777 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"] Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.286631 4795 generic.go:334] "Generic (PLEG): container finished" podID="aa08f457-a2bb-40ae-afe4-647920d80f5d" 
containerID="d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292" exitCode=0 Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.286738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerDied","Data":"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.286984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerStarted","Data":"a57329754df0615f27cca643d3caf47c4f91f7d152f100af5abc965130234bef"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.298445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"daeb9555-6d76-45ca-b3da-b6dd91c33e00","Type":"ContainerStarted","Data":"5be777487e5586bd93badf6c2c1b00f6b197adc9ea1d9c8e60728a166616e293"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.298489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"daeb9555-6d76-45ca-b3da-b6dd91c33e00","Type":"ContainerStarted","Data":"9c6fa1de60cfdd76be176b9f9175d3f6eee78ce3e820e7f88108d85834a9ef82"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.298562 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.301023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerStarted","Data":"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.348074 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=3.348054119 podStartE2EDuration="3.348054119s" podCreationTimestamp="2026-02-19 23:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:30.334237267 +0000 UTC m=+5541.526755131" watchObservedRunningTime="2026-02-19 23:00:30.348054119 +0000 UTC m=+5541.540572013" Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.387780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 19 23:00:31 crc kubenswrapper[4795]: I0219 23:00:31.326756 4795 generic.go:334] "Generic (PLEG): container finished" podID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerID="cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb" exitCode=0 Feb 19 23:00:31 crc kubenswrapper[4795]: I0219 23:00:31.326898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerDied","Data":"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb"} Feb 19 23:00:31 crc kubenswrapper[4795]: I0219 23:00:31.350834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerStarted","Data":"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac"} Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.157037 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.282909 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.363932 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="cinder-scheduler" containerID="cri-o://d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe" gracePeriod=30 Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.365065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerStarted","Data":"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d"} Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.365696 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="probe" containerID="cri-o://43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9" gracePeriod=30 Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.411833 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b9kk9" podStartSLOduration=2.95841042 podStartE2EDuration="6.411812604s" podCreationTimestamp="2026-02-19 23:00:26 +0000 UTC" firstStartedPulling="2026-02-19 23:00:28.247048765 +0000 UTC m=+5539.439566629" lastFinishedPulling="2026-02-19 23:00:31.700450949 +0000 UTC m=+5542.892968813" observedRunningTime="2026-02-19 23:00:32.39554836 +0000 UTC m=+5543.588066234" watchObservedRunningTime="2026-02-19 23:00:32.411812604 +0000 UTC m=+5543.604330468" Feb 19 23:00:33 crc kubenswrapper[4795]: I0219 23:00:33.374110 4795 generic.go:334] "Generic (PLEG): container finished" podID="29196316-7b32-486b-a786-10a3912bc206" containerID="43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9" exitCode=0 Feb 19 23:00:33 crc kubenswrapper[4795]: I0219 23:00:33.374157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerDied","Data":"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9"} Feb 19 23:00:33 crc kubenswrapper[4795]: I0219 23:00:33.377146 4795 generic.go:334] "Generic (PLEG): container finished" podID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerID="10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac" exitCode=0 Feb 19 23:00:33 crc kubenswrapper[4795]: I0219 23:00:33.377186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerDied","Data":"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac"} Feb 19 23:00:34 crc kubenswrapper[4795]: I0219 23:00:34.648661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.152829 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223388 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223710 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223782 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223835 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq4nx\" (UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223895 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223921 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.224316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.230821 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.230947 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts" (OuterVolumeSpecName: "scripts") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.241377 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx" (OuterVolumeSpecName: "kube-api-access-pq4nx") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "kube-api-access-pq4nx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.283285 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326321 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326358 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326368 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq4nx\" (UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326379 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326388 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.341885 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data" (OuterVolumeSpecName: "config-data") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.399452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerStarted","Data":"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8"} Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407450 4795 generic.go:334] "Generic (PLEG): container finished" podID="29196316-7b32-486b-a786-10a3912bc206" containerID="d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe" exitCode=0 Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerDied","Data":"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe"} Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerDied","Data":"92b489e7d1f8c23a12cd2c3b0232a07d52f20da60c5e7bd656a106e1bbfc254b"} Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407548 4795 scope.go:117] "RemoveContainer" containerID="43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407671 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.428472 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.438865 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4vwwf" podStartSLOduration=3.338683081 podStartE2EDuration="7.438845426s" podCreationTimestamp="2026-02-19 23:00:28 +0000 UTC" firstStartedPulling="2026-02-19 23:00:30.289750895 +0000 UTC m=+5541.482268759" lastFinishedPulling="2026-02-19 23:00:34.38991324 +0000 UTC m=+5545.582431104" observedRunningTime="2026-02-19 23:00:35.425368463 +0000 UTC m=+5546.617886327" watchObservedRunningTime="2026-02-19 23:00:35.438845426 +0000 UTC m=+5546.631363290" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.458296 4795 scope.go:117] "RemoveContainer" containerID="d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.462275 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.476194 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.491941 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:35 crc kubenswrapper[4795]: E0219 23:00:35.492377 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="cinder-scheduler" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.492390 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="cinder-scheduler" Feb 19 
23:00:35 crc kubenswrapper[4795]: E0219 23:00:35.492414 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="probe" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.492420 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="probe" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.492604 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="probe" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.492631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="cinder-scheduler" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.493593 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.497035 4795 scope.go:117] "RemoveContainer" containerID="43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.497257 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 23:00:35 crc kubenswrapper[4795]: E0219 23:00:35.499602 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9\": container with ID starting with 43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9 not found: ID does not exist" containerID="43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.499647 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9"} 
err="failed to get container status \"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9\": rpc error: code = NotFound desc = could not find container \"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9\": container with ID starting with 43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9 not found: ID does not exist" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.499675 4795 scope.go:117] "RemoveContainer" containerID="d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.502849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:35 crc kubenswrapper[4795]: E0219 23:00:35.503554 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe\": container with ID starting with d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe not found: ID does not exist" containerID="d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.503589 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe"} err="failed to get container status \"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe\": rpc error: code = NotFound desc = could not find container \"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe\": container with ID starting with d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe not found: ID does not exist" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.526473 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29196316-7b32-486b-a786-10a3912bc206" 
path="/var/lib/kubelet/pods/29196316-7b32-486b-a786-10a3912bc206/volumes" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.529824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.529945 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.530049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.530119 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.530324 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddk9\" (UniqueName: \"kubernetes.io/projected/85502c41-99ab-4a8f-9c36-f4d839b931a1-kube-api-access-xddk9\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " 
pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.530400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85502c41-99ab-4a8f-9c36-f4d839b931a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.632925 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85502c41-99ab-4a8f-9c36-f4d839b931a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633194 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633283 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddk9\" (UniqueName: \"kubernetes.io/projected/85502c41-99ab-4a8f-9c36-f4d839b931a1-kube-api-access-xddk9\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85502c41-99ab-4a8f-9c36-f4d839b931a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.636985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.637763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.642676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.653730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.654892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddk9\" (UniqueName: \"kubernetes.io/projected/85502c41-99ab-4a8f-9c36-f4d839b931a1-kube-api-access-xddk9\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.661425 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.811600 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:00:36 crc kubenswrapper[4795]: I0219 23:00:36.346543 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:36 crc kubenswrapper[4795]: I0219 23:00:36.346906 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:36 crc kubenswrapper[4795]: I0219 23:00:36.363944 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:36 crc kubenswrapper[4795]: I0219 23:00:36.417140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85502c41-99ab-4a8f-9c36-f4d839b931a1","Type":"ContainerStarted","Data":"97453bbb3480978fa58406787fe8a850f1853c6b62b3df6b6fa05ef1fc9640f1"} Feb 19 23:00:37 crc kubenswrapper[4795]: I0219 23:00:37.395037 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b9kk9" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" probeResult="failure" output=< Feb 19 23:00:37 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:00:37 crc kubenswrapper[4795]: > Feb 19 23:00:37 crc kubenswrapper[4795]: I0219 23:00:37.428752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85502c41-99ab-4a8f-9c36-f4d839b931a1","Type":"ContainerStarted","Data":"b590684afc64442bc6607340dcf33bb82f3583beec7f72c755dc77fedc311fdc"} Feb 19 23:00:38 crc kubenswrapper[4795]: I0219 23:00:38.440207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85502c41-99ab-4a8f-9c36-f4d839b931a1","Type":"ContainerStarted","Data":"b01c5ad1bb4374b6ffb7df6128eb4d4719ee00325fb78af74d7e54b6006036b2"} Feb 19 23:00:38 crc kubenswrapper[4795]: I0219 23:00:38.475110 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.475089992 podStartE2EDuration="3.475089992s" podCreationTimestamp="2026-02-19 23:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:38.46676016 +0000 UTC m=+5549.659278044" watchObservedRunningTime="2026-02-19 23:00:38.475089992 +0000 UTC m=+5549.667607856" Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.147628 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.148534 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.198129 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.495559 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.594802 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"] Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.773369 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 23:00:40 crc kubenswrapper[4795]: I0219 23:00:40.813320 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 23:00:41 crc kubenswrapper[4795]: I0219 23:00:41.464895 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4vwwf" 
podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="registry-server" containerID="cri-o://354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8" gracePeriod=2 Feb 19 23:00:41 crc kubenswrapper[4795]: I0219 23:00:41.958911 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.068441 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") pod \"aa08f457-a2bb-40ae-afe4-647920d80f5d\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.069464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities" (OuterVolumeSpecName: "utilities") pod "aa08f457-a2bb-40ae-afe4-647920d80f5d" (UID: "aa08f457-a2bb-40ae-afe4-647920d80f5d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.069579 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") pod \"aa08f457-a2bb-40ae-afe4-647920d80f5d\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.069860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") pod \"aa08f457-a2bb-40ae-afe4-647920d80f5d\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.070942 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.077189 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6" (OuterVolumeSpecName: "kube-api-access-zvrs6") pod "aa08f457-a2bb-40ae-afe4-647920d80f5d" (UID: "aa08f457-a2bb-40ae-afe4-647920d80f5d"). InnerVolumeSpecName "kube-api-access-zvrs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.116614 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa08f457-a2bb-40ae-afe4-647920d80f5d" (UID: "aa08f457-a2bb-40ae-afe4-647920d80f5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.173007 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.173038 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.491737 4795 generic.go:334] "Generic (PLEG): container finished" podID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerID="354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8" exitCode=0 Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.491831 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.491840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerDied","Data":"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8"} Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.492611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerDied","Data":"a57329754df0615f27cca643d3caf47c4f91f7d152f100af5abc965130234bef"} Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.492648 4795 scope.go:117] "RemoveContainer" containerID="354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.531230 4795 scope.go:117] "RemoveContainer" 
containerID="10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.550803 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"] Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.559475 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"] Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.585618 4795 scope.go:117] "RemoveContainer" containerID="d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.624095 4795 scope.go:117] "RemoveContainer" containerID="354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8" Feb 19 23:00:42 crc kubenswrapper[4795]: E0219 23:00:42.624655 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8\": container with ID starting with 354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8 not found: ID does not exist" containerID="354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.624724 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8"} err="failed to get container status \"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8\": rpc error: code = NotFound desc = could not find container \"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8\": container with ID starting with 354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8 not found: ID does not exist" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.624763 4795 scope.go:117] "RemoveContainer" 
containerID="10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac" Feb 19 23:00:42 crc kubenswrapper[4795]: E0219 23:00:42.625240 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac\": container with ID starting with 10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac not found: ID does not exist" containerID="10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.625335 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac"} err="failed to get container status \"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac\": rpc error: code = NotFound desc = could not find container \"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac\": container with ID starting with 10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac not found: ID does not exist" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.625414 4795 scope.go:117] "RemoveContainer" containerID="d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292" Feb 19 23:00:42 crc kubenswrapper[4795]: E0219 23:00:42.625854 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292\": container with ID starting with d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292 not found: ID does not exist" containerID="d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292" Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.625897 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292"} err="failed to get container status \"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292\": rpc error: code = NotFound desc = could not find container \"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292\": container with ID starting with d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292 not found: ID does not exist" Feb 19 23:00:43 crc kubenswrapper[4795]: I0219 23:00:43.531469 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" path="/var/lib/kubelet/pods/aa08f457-a2bb-40ae-afe4-647920d80f5d/volumes" Feb 19 23:00:46 crc kubenswrapper[4795]: I0219 23:00:46.058766 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 23:00:46 crc kubenswrapper[4795]: I0219 23:00:46.393020 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:46 crc kubenswrapper[4795]: I0219 23:00:46.434860 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:46 crc kubenswrapper[4795]: I0219 23:00:46.631408 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:47 crc kubenswrapper[4795]: I0219 23:00:47.533688 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b9kk9" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" containerID="cri-o://073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" gracePeriod=2 Feb 19 23:00:47 crc kubenswrapper[4795]: I0219 23:00:47.947280 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.097101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") pod \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.097535 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") pod \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.097662 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") pod \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.098027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities" (OuterVolumeSpecName: "utilities") pod "020ffbb9-5c2d-4cdd-af08-44c28850c44c" (UID: "020ffbb9-5c2d-4cdd-af08-44c28850c44c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.098306 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.102033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt" (OuterVolumeSpecName: "kube-api-access-gdcxt") pod "020ffbb9-5c2d-4cdd-af08-44c28850c44c" (UID: "020ffbb9-5c2d-4cdd-af08-44c28850c44c"). InnerVolumeSpecName "kube-api-access-gdcxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.200303 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.216104 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "020ffbb9-5c2d-4cdd-af08-44c28850c44c" (UID: "020ffbb9-5c2d-4cdd-af08-44c28850c44c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.301990 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545696 4795 generic.go:334] "Generic (PLEG): container finished" podID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerID="073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" exitCode=0 Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerDied","Data":"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d"} Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545776 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerDied","Data":"09bb4d42669be15e952593140fb7fd9a24d0d3ef4f5071cd4b5b653e574c2307"} Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545797 4795 scope.go:117] "RemoveContainer" containerID="073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545947 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.577087 4795 scope.go:117] "RemoveContainer" containerID="cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.620265 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.629046 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.629673 4795 scope.go:117] "RemoveContainer" containerID="3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.658250 4795 scope.go:117] "RemoveContainer" containerID="073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" Feb 19 23:00:48 crc kubenswrapper[4795]: E0219 23:00:48.658670 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d\": container with ID starting with 073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d not found: ID does not exist" containerID="073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.658724 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d"} err="failed to get container status \"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d\": rpc error: code = NotFound desc = could not find container \"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d\": container with ID starting with 073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d not found: ID does 
not exist" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.658757 4795 scope.go:117] "RemoveContainer" containerID="cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb" Feb 19 23:00:48 crc kubenswrapper[4795]: E0219 23:00:48.659203 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb\": container with ID starting with cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb not found: ID does not exist" containerID="cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.659247 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb"} err="failed to get container status \"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb\": rpc error: code = NotFound desc = could not find container \"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb\": container with ID starting with cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb not found: ID does not exist" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.659275 4795 scope.go:117] "RemoveContainer" containerID="3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4" Feb 19 23:00:48 crc kubenswrapper[4795]: E0219 23:00:48.659589 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4\": container with ID starting with 3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4 not found: ID does not exist" containerID="3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.659641 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4"} err="failed to get container status \"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4\": rpc error: code = NotFound desc = could not find container \"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4\": container with ID starting with 3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4 not found: ID does not exist" Feb 19 23:00:49 crc kubenswrapper[4795]: I0219 23:00:49.532859 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" path="/var/lib/kubelet/pods/020ffbb9-5c2d-4cdd-af08-44c28850c44c/volumes" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.161583 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525701-nmz5v"] Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162668 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162690 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162713 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162725 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162761 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162770 4795 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162791 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162801 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162821 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162831 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162847 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.163060 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.163118 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.163994 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.181674 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525701-nmz5v"] Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.348735 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.349555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnlc\" (UniqueName: \"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.349625 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.349791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.452019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.452114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnlc\" (UniqueName: \"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.452134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.452155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.458751 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.458956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.460373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.469361 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnlc\" (UniqueName: \"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.488476 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.938607 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525701-nmz5v"] Feb 19 23:01:00 crc kubenswrapper[4795]: W0219 23:01:00.945091 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f2d7932_b11f_4e9b_a6e0_2a9a069a3459.slice/crio-9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045 WatchSource:0}: Error finding container 9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045: Status 404 returned error can't find the container with id 9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045 Feb 19 23:01:01 crc kubenswrapper[4795]: I0219 23:01:01.725869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-nmz5v" event={"ID":"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459","Type":"ContainerStarted","Data":"928018fdfd9df98dcda0e154657ded64328052c445bfd14bf81fd4a1c8a9f74b"} Feb 19 23:01:01 crc kubenswrapper[4795]: I0219 23:01:01.726243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-nmz5v" event={"ID":"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459","Type":"ContainerStarted","Data":"9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045"} Feb 19 23:01:01 crc kubenswrapper[4795]: I0219 23:01:01.753085 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525701-nmz5v" podStartSLOduration=1.7530656169999999 podStartE2EDuration="1.753065617s" podCreationTimestamp="2026-02-19 23:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:01.745644868 +0000 UTC m=+5572.938162732" watchObservedRunningTime="2026-02-19 23:01:01.753065617 +0000 UTC m=+5572.945583481" Feb 19 23:01:03 
crc kubenswrapper[4795]: I0219 23:01:03.755074 4795 generic.go:334] "Generic (PLEG): container finished" podID="5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" containerID="928018fdfd9df98dcda0e154657ded64328052c445bfd14bf81fd4a1c8a9f74b" exitCode=0 Feb 19 23:01:03 crc kubenswrapper[4795]: I0219 23:01:03.755255 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-nmz5v" event={"ID":"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459","Type":"ContainerDied","Data":"928018fdfd9df98dcda0e154657ded64328052c445bfd14bf81fd4a1c8a9f74b"} Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.092308 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.241583 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") pod \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.241690 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") pod \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.241839 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") pod \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.241871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwnlc\" (UniqueName: 
\"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") pod \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.247152 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" (UID: "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.247529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc" (OuterVolumeSpecName: "kube-api-access-mwnlc") pod "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" (UID: "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459"). InnerVolumeSpecName "kube-api-access-mwnlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.278253 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" (UID: "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.289179 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data" (OuterVolumeSpecName: "config-data") pod "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" (UID: "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.343423 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.343449 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwnlc\" (UniqueName: \"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.343459 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.343469 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.779550 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-nmz5v" event={"ID":"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459","Type":"ContainerDied","Data":"9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045"} Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.779946 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.780215 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:16 crc kubenswrapper[4795]: I0219 23:01:16.042270 4795 scope.go:117] "RemoveContainer" containerID="da78b2ce5ad3acba003da6948c0acb827331e556f914cb3178cd5862028563a8" Feb 19 23:01:16 crc kubenswrapper[4795]: I0219 23:01:16.075605 4795 scope.go:117] "RemoveContainer" containerID="be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9" Feb 19 23:01:28 crc kubenswrapper[4795]: I0219 23:01:28.427945 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:01:28 crc kubenswrapper[4795]: I0219 23:01:28.428807 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:01:58 crc kubenswrapper[4795]: I0219 23:01:58.427439 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:01:58 crc kubenswrapper[4795]: I0219 23:01:58.429097 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:02:22 crc kubenswrapper[4795]: I0219 23:02:22.064446 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pgc5v"] Feb 19 23:02:22 crc kubenswrapper[4795]: I0219 23:02:22.072843 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"] Feb 19 23:02:22 crc kubenswrapper[4795]: I0219 23:02:22.081832 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"] Feb 19 23:02:22 crc kubenswrapper[4795]: I0219 23:02:22.089202 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pgc5v"] Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.525671 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" path="/var/lib/kubelet/pods/036fd6f7-0c88-4c92-9a98-0a774124c8fd/volumes" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.526829 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" path="/var/lib/kubelet/pods/d1e0382a-40d3-42e1-93d3-e5098af1e54f/volumes" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.535471 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-knqfl"] Feb 19 23:02:23 crc kubenswrapper[4795]: E0219 23:02:23.535947 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" containerName="keystone-cron" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.535968 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" containerName="keystone-cron" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.536298 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" containerName="keystone-cron" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.537021 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.541779 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.542265 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zbt4l" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.549372 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-lrv52"] Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.567311 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-knqfl"] Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.567762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.574519 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lrv52"] Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.602714 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-scripts\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.604404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-log-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.604652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rx4xt\" (UniqueName: \"kubernetes.io/projected/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-kube-api-access-rx4xt\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.604837 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.605047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.708538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-scripts\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.708789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-run\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.708877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-log-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.710369 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx4xt\" (UniqueName: \"kubernetes.io/projected/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-kube-api-access-rx4xt\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.710780 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-log-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.711668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e98c62c-20fc-462c-9973-2616cb184032-scripts\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.711837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.711960 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-etc-ovs\") pod \"ovn-controller-ovs-lrv52\" (UID: 
\"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-log\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-lib\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712510 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v7wq\" (UniqueName: \"kubernetes.io/projected/9e98c62c-20fc-462c-9973-2616cb184032-kube-api-access-4v7wq\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712755 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 
23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.713349 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-scripts\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.729928 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx4xt\" (UniqueName: \"kubernetes.io/projected/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-kube-api-access-rx4xt\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-etc-ovs\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-log\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v7wq\" (UniqueName: 
\"kubernetes.io/projected/9e98c62c-20fc-462c-9973-2616cb184032-kube-api-access-4v7wq\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813908 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-lib\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813942 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-etc-ovs\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-run\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813988 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-run\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.814031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-log\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " 
pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.814037 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e98c62c-20fc-462c-9973-2616cb184032-scripts\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.814320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-lib\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.815912 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e98c62c-20fc-462c-9973-2616cb184032-scripts\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.829234 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v7wq\" (UniqueName: \"kubernetes.io/projected/9e98c62c-20fc-462c-9973-2616cb184032-kube-api-access-4v7wq\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.889718 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.907545 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.370302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-knqfl"] Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.780128 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lrv52"] Feb 19 23:02:24 crc kubenswrapper[4795]: W0219 23:02:24.781600 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e98c62c_20fc_462c_9973_2616cb184032.slice/crio-cf7f93d39e5b96fef8397c2e3eb29d0e53274c9b6760f32bae9d3cedc9e2c9ec WatchSource:0}: Error finding container cf7f93d39e5b96fef8397c2e3eb29d0e53274c9b6760f32bae9d3cedc9e2c9ec: Status 404 returned error can't find the container with id cf7f93d39e5b96fef8397c2e3eb29d0e53274c9b6760f32bae9d3cedc9e2c9ec Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.783661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl" event={"ID":"3bbc323f-3f18-42bc-b0d8-12f021d91d6b","Type":"ContainerStarted","Data":"b63d30a586da308b8a8d09e24fdb92d0880e58235af42ed511eb9acb18ec4616"} Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.783695 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl" event={"ID":"3bbc323f-3f18-42bc-b0d8-12f021d91d6b","Type":"ContainerStarted","Data":"a2192e4b328ebbe549f9817abd6e79e991147dad6ae8a53ca760a3d199231af1"} Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.784361 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-knqfl" Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.808298 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-knqfl" podStartSLOduration=1.808274002 podStartE2EDuration="1.808274002s" podCreationTimestamp="2026-02-19 23:02:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:24.798290888 +0000 UTC m=+5655.990808752" watchObservedRunningTime="2026-02-19 23:02:24.808274002 +0000 UTC m=+5656.000791876" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.068281 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-kx9qd"] Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.070055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.074857 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.088247 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kx9qd"] Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.146331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovn-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.146481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovs-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.146538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgpd8\" (UniqueName: 
\"kubernetes.io/projected/b48804d5-a275-45dd-896c-f35b7a322690-kube-api-access-wgpd8\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.146654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b48804d5-a275-45dd-896c-f35b7a322690-config\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovn-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248772 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovs-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgpd8\" (UniqueName: \"kubernetes.io/projected/b48804d5-a275-45dd-896c-f35b7a322690-kube-api-access-wgpd8\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b48804d5-a275-45dd-896c-f35b7a322690-config\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovs-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovn-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.249743 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b48804d5-a275-45dd-896c-f35b7a322690-config\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.281189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgpd8\" (UniqueName: \"kubernetes.io/projected/b48804d5-a275-45dd-896c-f35b7a322690-kube-api-access-wgpd8\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.397192 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.792392 4795 generic.go:334] "Generic (PLEG): container finished" podID="9e98c62c-20fc-462c-9973-2616cb184032" containerID="b00e9c8b111bd9cfeb5dfdb129b307c5979b745ca4ba7e8292aa3f29a3405232" exitCode=0 Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.792448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lrv52" event={"ID":"9e98c62c-20fc-462c-9973-2616cb184032","Type":"ContainerDied","Data":"b00e9c8b111bd9cfeb5dfdb129b307c5979b745ca4ba7e8292aa3f29a3405232"} Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.792697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lrv52" event={"ID":"9e98c62c-20fc-462c-9973-2616cb184032","Type":"ContainerStarted","Data":"cf7f93d39e5b96fef8397c2e3eb29d0e53274c9b6760f32bae9d3cedc9e2c9ec"} Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.854258 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kx9qd"] Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.802527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kx9qd" event={"ID":"b48804d5-a275-45dd-896c-f35b7a322690","Type":"ContainerStarted","Data":"e8ae1b3ad036414afe3416f20a1f10c4b5931e9c43dc1afd4e11101790124a0d"} Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.803204 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kx9qd" event={"ID":"b48804d5-a275-45dd-896c-f35b7a322690","Type":"ContainerStarted","Data":"cc310fc1cee67a993981338a5f14330a9d8f36fcd9ccbe97e51c5a20c1afb110"} Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.812452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lrv52" 
event={"ID":"9e98c62c-20fc-462c-9973-2616cb184032","Type":"ContainerStarted","Data":"8975fba2cfc268c57f22092b674d4659416031c54a1449b009e4d0297e7c9dbb"} Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.812511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lrv52" event={"ID":"9e98c62c-20fc-462c-9973-2616cb184032","Type":"ContainerStarted","Data":"ae4b9c53934ac248ca2b3d73eac1cbc515c2e9bbb8a2d212f4a6d383268f1547"} Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.812787 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.812850 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.826989 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-kx9qd" podStartSLOduration=2.826973335 podStartE2EDuration="2.826973335s" podCreationTimestamp="2026-02-19 23:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:26.820124101 +0000 UTC m=+5658.012641965" watchObservedRunningTime="2026-02-19 23:02:26.826973335 +0000 UTC m=+5658.019491199" Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.858888 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-lrv52" podStartSLOduration=3.8587940400000003 podStartE2EDuration="3.85879404s" podCreationTimestamp="2026-02-19 23:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:26.846953903 +0000 UTC m=+5658.039471777" watchObservedRunningTime="2026-02-19 23:02:26.85879404 +0000 UTC m=+5658.051311904" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 
23:02:28.427357 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.427727 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.427793 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.428655 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.428716 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" gracePeriod=600 Feb 19 23:02:28 crc kubenswrapper[4795]: E0219 23:02:28.549268 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.838709 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" exitCode=0 Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.838770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"} Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.838822 4795 scope.go:117] "RemoveContainer" containerID="fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.839708 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:02:28 crc kubenswrapper[4795]: E0219 23:02:28.840227 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:02:29 crc kubenswrapper[4795]: I0219 23:02:29.043488 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 23:02:29 crc kubenswrapper[4795]: I0219 23:02:29.056516 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 23:02:29 crc kubenswrapper[4795]: I0219 23:02:29.524482 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96625ae6-8eb0-43d0-a180-20c79dfd6717" path="/var/lib/kubelet/pods/96625ae6-8eb0-43d0-a180-20c79dfd6717/volumes" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.322798 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.325655 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.338988 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.386901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.386971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.488696 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 
23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.488919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.489947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.510424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.652848 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.123776 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.577194 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.578520 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.581108 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.611446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.611687 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.642931 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.713294 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.713422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") pod 
\"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.714520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.737468 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.894132 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.950858 4795 generic.go:334] "Generic (PLEG): container finished" podID="7389820e-b641-4068-b624-af539a234699" containerID="461ac9425c34a3821048eb55409a0a70365c0acacbfa307ea4409068b90afe68" exitCode=0 Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.950901 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rfntz" event={"ID":"7389820e-b641-4068-b624-af539a234699","Type":"ContainerDied","Data":"461ac9425c34a3821048eb55409a0a70365c0acacbfa307ea4409068b90afe68"} Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.950924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rfntz" event={"ID":"7389820e-b641-4068-b624-af539a234699","Type":"ContainerStarted","Data":"2d1715a61d0ec12b7dcc1eff75ded916d2307e4020ffa4bcd927cf7a2bbbe7c6"} Feb 19 23:02:41 crc kubenswrapper[4795]: I0219 23:02:41.349654 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:02:41 crc kubenswrapper[4795]: I0219 23:02:41.967097 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" containerID="fe6c050cae9125e8669040d524bc8951c45b9effbc5921d0ee07f69c88d514a2" exitCode=0 Feb 19 23:02:41 crc kubenswrapper[4795]: I0219 23:02:41.967299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c873-account-create-update-fggql" event={"ID":"9ca0a783-4d18-4d0a-81d8-7cc1970379a9","Type":"ContainerDied","Data":"fe6c050cae9125e8669040d524bc8951c45b9effbc5921d0ee07f69c88d514a2"} Feb 19 23:02:41 crc kubenswrapper[4795]: I0219 23:02:41.967325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c873-account-create-update-fggql" 
event={"ID":"9ca0a783-4d18-4d0a-81d8-7cc1970379a9","Type":"ContainerStarted","Data":"9f0e85eba5f4b32dbdff81ed3ccaef5e2ea7cb994e8d4f47c422be26c462d6b7"} Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.064024 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tghht"] Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.070973 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tghht"] Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.398006 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.511532 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:02:42 crc kubenswrapper[4795]: E0219 23:02:42.511848 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.564870 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") pod \"7389820e-b641-4068-b624-af539a234699\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.565065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") pod 
\"7389820e-b641-4068-b624-af539a234699\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.565651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7389820e-b641-4068-b624-af539a234699" (UID: "7389820e-b641-4068-b624-af539a234699"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.566458 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.578334 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc" (OuterVolumeSpecName: "kube-api-access-twfsc") pod "7389820e-b641-4068-b624-af539a234699" (UID: "7389820e-b641-4068-b624-af539a234699"). InnerVolumeSpecName "kube-api-access-twfsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.668211 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.980784 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.980816 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rfntz" event={"ID":"7389820e-b641-4068-b624-af539a234699","Type":"ContainerDied","Data":"2d1715a61d0ec12b7dcc1eff75ded916d2307e4020ffa4bcd927cf7a2bbbe7c6"} Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.980891 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d1715a61d0ec12b7dcc1eff75ded916d2307e4020ffa4bcd927cf7a2bbbe7c6" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.397311 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.494904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") pod \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.494965 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") pod \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.496367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ca0a783-4d18-4d0a-81d8-7cc1970379a9" (UID: "9ca0a783-4d18-4d0a-81d8-7cc1970379a9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.500370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp" (OuterVolumeSpecName: "kube-api-access-qb4qp") pod "9ca0a783-4d18-4d0a-81d8-7cc1970379a9" (UID: "9ca0a783-4d18-4d0a-81d8-7cc1970379a9"). InnerVolumeSpecName "kube-api-access-qb4qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.527296 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" path="/var/lib/kubelet/pods/fe89b6c7-308b-42a8-92a9-da093d6bbae4/volumes" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.596906 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.596966 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.989427 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c873-account-create-update-fggql" event={"ID":"9ca0a783-4d18-4d0a-81d8-7cc1970379a9","Type":"ContainerDied","Data":"9f0e85eba5f4b32dbdff81ed3ccaef5e2ea7cb994e8d4f47c422be26c462d6b7"} Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.989467 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f0e85eba5f4b32dbdff81ed3ccaef5e2ea7cb994e8d4f47c422be26c462d6b7" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.989470 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.795313 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:02:45 crc kubenswrapper[4795]: E0219 23:02:45.796096 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" containerName="mariadb-account-create-update" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.796150 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" containerName="mariadb-account-create-update" Feb 19 23:02:45 crc kubenswrapper[4795]: E0219 23:02:45.796191 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7389820e-b641-4068-b624-af539a234699" containerName="mariadb-database-create" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.796201 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7389820e-b641-4068-b624-af539a234699" containerName="mariadb-database-create" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.796468 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" containerName="mariadb-account-create-update" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.796496 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7389820e-b641-4068-b624-af539a234699" containerName="mariadb-database-create" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.797227 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.830210 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.933906 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.934066 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.035872 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.036097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.037444 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.058825 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.119316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.292995 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.294764 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.299240 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.321286 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.444040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.444379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.546073 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.546117 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.546844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.563650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.604790 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:02:46 crc kubenswrapper[4795]: W0219 23:02:46.608088 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af5a019_2aa4_449d_a1a5_148cbf8a1ffa.slice/crio-d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc WatchSource:0}: Error finding container d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc: Status 404 returned error can't find the container with id d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.621682 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:47 crc kubenswrapper[4795]: I0219 23:02:47.014472 4795 generic.go:334] "Generic (PLEG): container finished" podID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" containerID="922d8fbba822aa06530b36452ecdfe6cce93b9221523be2934bb7556f81619e3" exitCode=0 Feb 19 23:02:47 crc kubenswrapper[4795]: I0219 23:02:47.014752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-2vlg7" event={"ID":"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa","Type":"ContainerDied","Data":"922d8fbba822aa06530b36452ecdfe6cce93b9221523be2934bb7556f81619e3"} Feb 19 23:02:47 crc kubenswrapper[4795]: I0219 23:02:47.014774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-2vlg7" event={"ID":"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa","Type":"ContainerStarted","Data":"d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc"} Feb 19 23:02:47 crc kubenswrapper[4795]: W0219 23:02:47.103359 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ea8f33a_4a23_4058_a6f1_ccd27d64f1f2.slice/crio-b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865 WatchSource:0}: Error finding container b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865: Status 404 returned error can't find the container with id b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865 Feb 19 23:02:47 crc kubenswrapper[4795]: I0219 23:02:47.103501 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.023413 4795 generic.go:334] "Generic (PLEG): container finished" podID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" containerID="8783e1276005ef506acf0881371a237a591f18c217cbbd901f218637b5c95d2c" exitCode=0 Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 
23:02:48.023453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-75e1-account-create-update-mq672" event={"ID":"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2","Type":"ContainerDied","Data":"8783e1276005ef506acf0881371a237a591f18c217cbbd901f218637b5c95d2c"} Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.023836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-75e1-account-create-update-mq672" event={"ID":"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2","Type":"ContainerStarted","Data":"b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865"} Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.386660 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.582412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") pod \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.582632 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") pod \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.583369 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" (UID: "2af5a019-2aa4-449d-a1a5-148cbf8a1ffa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.588561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq" (OuterVolumeSpecName: "kube-api-access-nsnmq") pod "2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" (UID: "2af5a019-2aa4-449d-a1a5-148cbf8a1ffa"). InnerVolumeSpecName "kube-api-access-nsnmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.686712 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.686766 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.037462 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-2vlg7" event={"ID":"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa","Type":"ContainerDied","Data":"d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc"} Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.037532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.037540 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.498002 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.504690 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") pod \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.504829 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") pod \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.505335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" (UID: "0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.506012 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.510238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb" (OuterVolumeSpecName: "kube-api-access-xtbjb") pod "0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" (UID: "0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2"). InnerVolumeSpecName "kube-api-access-xtbjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.607391 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:50 crc kubenswrapper[4795]: I0219 23:02:50.049671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-75e1-account-create-update-mq672" event={"ID":"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2","Type":"ContainerDied","Data":"b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865"} Feb 19 23:02:50 crc kubenswrapper[4795]: I0219 23:02:50.049715 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865" Feb 19 23:02:50 crc kubenswrapper[4795]: I0219 23:02:50.049773 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.461954 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-556fc55b45-7gxcm"] Feb 19 23:02:52 crc kubenswrapper[4795]: E0219 23:02:52.463065 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" containerName="mariadb-database-create" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.463091 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" containerName="mariadb-database-create" Feb 19 23:02:52 crc kubenswrapper[4795]: E0219 23:02:52.463139 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" containerName="mariadb-account-create-update" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.463152 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" containerName="mariadb-account-create-update" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.463514 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" containerName="mariadb-account-create-update" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.463560 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" containerName="mariadb-database-create" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.465572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.468082 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.468533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.468654 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-pcc54" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.469901 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-556fc55b45-7gxcm"] Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.581996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data-merged\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.582091 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-combined-ca-bundle\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.582130 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-scripts\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.582328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.582363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-octavia-run\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684039 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-octavia-run\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data-merged\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684200 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-combined-ca-bundle\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684226 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-scripts\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.685409 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data-merged\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.685572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-octavia-run\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.690235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-combined-ca-bundle\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.691070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.692916 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-scripts\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.785824 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:53 crc kubenswrapper[4795]: I0219 23:02:53.349521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-556fc55b45-7gxcm"] Feb 19 23:02:54 crc kubenswrapper[4795]: I0219 23:02:54.105022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-556fc55b45-7gxcm" event={"ID":"c95e6fdb-6007-4490-9572-a2709f8b7daf","Type":"ContainerStarted","Data":"758dffc4eee1d177de011dc64f85b5ca149775230838f4cfd16f8f6799f68752"} Feb 19 23:02:54 crc kubenswrapper[4795]: I0219 23:02:54.511983 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:02:54 crc kubenswrapper[4795]: E0219 23:02:54.512354 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:02:58 crc kubenswrapper[4795]: I0219 23:02:58.953688 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-knqfl" podUID="3bbc323f-3f18-42bc-b0d8-12f021d91d6b" containerName="ovn-controller" probeResult="failure" output=< Feb 19 23:02:58 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 23:02:58 crc kubenswrapper[4795]: > Feb 19 23:02:58 crc kubenswrapper[4795]: I0219 23:02:58.960848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:58 crc kubenswrapper[4795]: I0219 23:02:58.964244 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.094223 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.097985 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.101529 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103091 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103175 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103192 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103237 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.120869 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " 
pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207507 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.208662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " 
pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.209145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.209213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.211893 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.213252 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.234687 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 
23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.423719 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:03:01 crc kubenswrapper[4795]: I0219 23:03:01.536138 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.195456 4795 generic.go:334] "Generic (PLEG): container finished" podID="c95e6fdb-6007-4490-9572-a2709f8b7daf" containerID="69b94ec375777caa88f05328a06535b8d3ebce689a7af410b0f69a119ac02efd" exitCode=0 Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.195521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-556fc55b45-7gxcm" event={"ID":"c95e6fdb-6007-4490-9572-a2709f8b7daf","Type":"ContainerDied","Data":"69b94ec375777caa88f05328a06535b8d3ebce689a7af410b0f69a119ac02efd"} Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.200233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl-config-qq79f" event={"ID":"d710e8ab-01d7-4137-872b-71a05ac52188","Type":"ContainerStarted","Data":"2096e22df9aade6dc70570abe24d3dbe208045474ec5189f56a5d1beac2fa739"} Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.200300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl-config-qq79f" event={"ID":"d710e8ab-01d7-4137-872b-71a05ac52188","Type":"ContainerStarted","Data":"7a8fcb62e8aafde30d2ad214505dba921e7102800f504cadda4d53ca44430291"} Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.260284 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-knqfl-config-qq79f" podStartSLOduration=3.260262863 podStartE2EDuration="3.260262863s" podCreationTimestamp="2026-02-19 23:02:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-19 23:03:02.258453631 +0000 UTC m=+5693.450971505" watchObservedRunningTime="2026-02-19 23:03:02.260262863 +0000 UTC m=+5693.452780737" Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.210254 4795 generic.go:334] "Generic (PLEG): container finished" podID="d710e8ab-01d7-4137-872b-71a05ac52188" containerID="2096e22df9aade6dc70570abe24d3dbe208045474ec5189f56a5d1beac2fa739" exitCode=0 Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.210445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl-config-qq79f" event={"ID":"d710e8ab-01d7-4137-872b-71a05ac52188","Type":"ContainerDied","Data":"2096e22df9aade6dc70570abe24d3dbe208045474ec5189f56a5d1beac2fa739"} Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.213126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-556fc55b45-7gxcm" event={"ID":"c95e6fdb-6007-4490-9572-a2709f8b7daf","Type":"ContainerStarted","Data":"a54f0ee6b3807ca88881dac5abb58702b27c13ccaa0b8c0ff4269b867d42f5d8"} Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.213176 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-556fc55b45-7gxcm" event={"ID":"c95e6fdb-6007-4490-9572-a2709f8b7daf","Type":"ContainerStarted","Data":"ac78542103643657605fe59b2edbcd40f711e2b924559ba9abeea6a866861ecc"} Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.214060 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.214083 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.257928 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-556fc55b45-7gxcm" podStartSLOduration=3.523069343 podStartE2EDuration="11.257902747s" 
podCreationTimestamp="2026-02-19 23:02:52 +0000 UTC" firstStartedPulling="2026-02-19 23:02:53.356058605 +0000 UTC m=+5684.548576469" lastFinishedPulling="2026-02-19 23:03:01.090892009 +0000 UTC m=+5692.283409873" observedRunningTime="2026-02-19 23:03:03.256748484 +0000 UTC m=+5694.449266348" watchObservedRunningTime="2026-02-19 23:03:03.257902747 +0000 UTC m=+5694.450420631" Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.941383 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-knqfl" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.597580 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609224 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run" (OuterVolumeSpecName: "var-run") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610212 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610229 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610240 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610388 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts" (OuterVolumeSpecName: "scripts") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.648126 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc" (OuterVolumeSpecName: "kube-api-access-726pc") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "kube-api-access-726pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.713257 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.713331 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.713344 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.230653 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.230846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl-config-qq79f" event={"ID":"d710e8ab-01d7-4137-872b-71a05ac52188","Type":"ContainerDied","Data":"7a8fcb62e8aafde30d2ad214505dba921e7102800f504cadda4d53ca44430291"} Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.231054 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a8fcb62e8aafde30d2ad214505dba921e7102800f504cadda4d53ca44430291" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.347045 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-gzbmf"] Feb 19 23:03:05 crc kubenswrapper[4795]: E0219 23:03:05.347524 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d710e8ab-01d7-4137-872b-71a05ac52188" containerName="ovn-config" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.347541 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d710e8ab-01d7-4137-872b-71a05ac52188" containerName="ovn-config" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.347714 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d710e8ab-01d7-4137-872b-71a05ac52188" containerName="ovn-config" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.348626 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.352195 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.352366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.352462 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.356987 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-gzbmf"] Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.430070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data-merged\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.430175 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b3ffcac3-ee64-440c-983d-67404e5f47fd-hm-ports\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.430228 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.430251 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-scripts\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.530830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b3ffcac3-ee64-440c-983d-67404e5f47fd-hm-ports\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.530901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.530930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-scripts\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.531032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data-merged\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.531494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data-merged\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.532013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b3ffcac3-ee64-440c-983d-67404e5f47fd-hm-ports\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.536020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.540306 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-scripts\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.665506 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.697437 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.706684 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.374934 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"] Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.380903 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.384324 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.386448 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"] Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.452267 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.452318 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.483391 
4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-gzbmf"] Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.512015 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:03:06 crc kubenswrapper[4795]: E0219 23:03:06.512238 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.553589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.553649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.555424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc 
kubenswrapper[4795]: I0219 23:03:06.567456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.721015 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.221268 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"] Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.262337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-gzbmf" event={"ID":"b3ffcac3-ee64-440c-983d-67404e5f47fd","Type":"ContainerStarted","Data":"12cac13ec5fb25a6171cc270755d648f8163cc8154f582f842720ae0015fbed7"} Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.274031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerStarted","Data":"9d50e6342b96a4c35e1fe26f38816a55271a38a4cb1281be352ab5ae7f179533"} Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.288523 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.290371 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.301027 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.317780 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.473624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.473696 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.473816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.473849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 
23:03:07.526324 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d710e8ab-01d7-4137-872b-71a05ac52188" path="/var/lib/kubelet/pods/d710e8ab-01d7-4137-872b-71a05ac52188/volumes" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.527145 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-gzbmf"] Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.575945 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.576009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.576112 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.576156 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.578426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.583213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.584110 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.586289 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.619046 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:08 crc kubenswrapper[4795]: I0219 23:03:08.115525 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:03:08 crc kubenswrapper[4795]: I0219 23:03:08.282720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerStarted","Data":"14cb79bdd1caf585f83eec9d0981b74eaae9b5dab8922c52ff8baeedb37030f3"} Feb 19 23:03:09 crc kubenswrapper[4795]: I0219 23:03:09.321014 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerID="700b20fbdb16cdb721d65035d59d19827c546e867ba32ed9f351235ca4bc0246" exitCode=0 Feb 19 23:03:09 crc kubenswrapper[4795]: I0219 23:03:09.321555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerDied","Data":"700b20fbdb16cdb721d65035d59d19827c546e867ba32ed9f351235ca4bc0246"} Feb 19 23:03:10 crc kubenswrapper[4795]: I0219 23:03:10.332526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-gzbmf" event={"ID":"b3ffcac3-ee64-440c-983d-67404e5f47fd","Type":"ContainerStarted","Data":"4b70afb267e5c9a99026ae17ebb1f8309c23e1b1e7119d4584335d427e2d81e0"} Feb 19 23:03:10 crc kubenswrapper[4795]: I0219 23:03:10.335494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerStarted","Data":"d4c8e37b4453cd9ee00d52e7178d48381bffc05905e787ed7678728d6f9cc0ef"} Feb 19 23:03:10 crc kubenswrapper[4795]: I0219 23:03:10.371746 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-z8cdz" podStartSLOduration=3.37172876 podStartE2EDuration="3.37172876s" podCreationTimestamp="2026-02-19 23:03:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:03:10.368096687 +0000 UTC m=+5701.560614571" watchObservedRunningTime="2026-02-19 23:03:10.37172876 +0000 UTC m=+5701.564246624" Feb 19 23:03:12 crc kubenswrapper[4795]: I0219 23:03:12.167622 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:03:12 crc kubenswrapper[4795]: I0219 23:03:12.308180 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:03:12 crc kubenswrapper[4795]: I0219 23:03:12.366464 4795 generic.go:334] "Generic (PLEG): container finished" podID="b3ffcac3-ee64-440c-983d-67404e5f47fd" containerID="4b70afb267e5c9a99026ae17ebb1f8309c23e1b1e7119d4584335d427e2d81e0" exitCode=0 Feb 19 23:03:12 crc kubenswrapper[4795]: I0219 23:03:12.368047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-gzbmf" event={"ID":"b3ffcac3-ee64-440c-983d-67404e5f47fd","Type":"ContainerDied","Data":"4b70afb267e5c9a99026ae17ebb1f8309c23e1b1e7119d4584335d427e2d81e0"} Feb 19 23:03:13 crc kubenswrapper[4795]: I0219 23:03:13.378291 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerID="d4c8e37b4453cd9ee00d52e7178d48381bffc05905e787ed7678728d6f9cc0ef" exitCode=0 Feb 19 23:03:13 crc kubenswrapper[4795]: I0219 23:03:13.378360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerDied","Data":"d4c8e37b4453cd9ee00d52e7178d48381bffc05905e787ed7678728d6f9cc0ef"} Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.677484 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.864358 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.864668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.864772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.864843 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.872931 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts" (OuterVolumeSpecName: "scripts") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.925352 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data" (OuterVolumeSpecName: "config-data") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:15 crc kubenswrapper[4795]: E0219 23:03:15.926096 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged podName:c7895d70-3c78-4913-9028-75797e6e1dbd nodeName:}" failed. No retries permitted until 2026-02-19 23:03:16.426063351 +0000 UTC m=+5707.618581225 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data-merged" (UniqueName: "kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd") : error deleting /var/lib/kubelet/pods/c7895d70-3c78-4913-9028-75797e6e1dbd/volume-subpaths: remove /var/lib/kubelet/pods/c7895d70-3c78-4913-9028-75797e6e1dbd/volume-subpaths: no such file or directory Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.929974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.968653 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.968696 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.968705 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.262925 4795 scope.go:117] "RemoveContainer" containerID="2119979e9a0784ca3835a55937d4bf0a118f5016d04e68e3c74e02efc1b7df06" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.344917 4795 scope.go:117] "RemoveContainer" containerID="000fa8f747858d495d2c6d2c850ff5f2717d6c22150f68c950eeae0bca0bd7f7" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.388541 4795 scope.go:117] "RemoveContainer" containerID="f711a9de7781799454b8f0272a2636f2533360c8ba6ce3a134936f4cf9908d61" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.411544 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerDied","Data":"14cb79bdd1caf585f83eec9d0981b74eaae9b5dab8922c52ff8baeedb37030f3"} Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.411580 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14cb79bdd1caf585f83eec9d0981b74eaae9b5dab8922c52ff8baeedb37030f3" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.411592 4795 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.420583 4795 scope.go:117] "RemoveContainer" containerID="e2855b43837d8ca73ce1e28756facf9b648ea7b22ec3a7d6bc132133d332b880" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.477250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.478201 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.582604 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") on node \"crc\" DevicePath \"\""
Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.425034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-gzbmf" event={"ID":"b3ffcac3-ee64-440c-983d-67404e5f47fd","Type":"ContainerStarted","Data":"1bfff6d4fd95fd8d7fa708823512b19ab2b9d4cb0dd8429c0f1342f6642f76b3"}
Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.425525 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-gzbmf"
Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.427981 4795 generic.go:334] "Generic (PLEG): container finished" podID="04efa30d-5580-4301-8a36-b452e949fcd3" containerID="17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef" exitCode=0
Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.428034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerDied","Data":"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef"}
Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.447365 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-gzbmf" podStartSLOduration=2.6410139409999998 podStartE2EDuration="12.447336978s" podCreationTimestamp="2026-02-19 23:03:05 +0000 UTC" firstStartedPulling="2026-02-19 23:03:06.494408442 +0000 UTC m=+5697.686926306" lastFinishedPulling="2026-02-19 23:03:16.300731479 +0000 UTC m=+5707.493249343" observedRunningTime="2026-02-19 23:03:17.441771639 +0000 UTC m=+5708.634289503" watchObservedRunningTime="2026-02-19 23:03:17.447336978 +0000 UTC m=+5708.639854862"
Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.512666 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"
Feb 19 23:03:17 crc kubenswrapper[4795]: E0219 23:03:17.512943 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:03:18 crc kubenswrapper[4795]: I0219 23:03:18.440873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerStarted","Data":"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d"}
Feb 19 23:03:18 crc kubenswrapper[4795]: I0219 23:03:18.458507 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" podStartSLOduration=3.240983285 podStartE2EDuration="12.457989822s" podCreationTimestamp="2026-02-19 23:03:06 +0000 UTC" firstStartedPulling="2026-02-19 23:03:07.244651616 +0000 UTC m=+5698.437169480" lastFinishedPulling="2026-02-19 23:03:16.461658153 +0000 UTC m=+5707.654176017" observedRunningTime="2026-02-19 23:03:18.453906386 +0000 UTC m=+5709.646424250" watchObservedRunningTime="2026-02-19 23:03:18.457989822 +0000 UTC m=+5709.650507706"
Feb 19 23:03:31 crc kubenswrapper[4795]: I0219 23:03:31.514669 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"
Feb 19 23:03:31 crc kubenswrapper[4795]: E0219 23:03:31.517581 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:03:35 crc kubenswrapper[4795]: I0219 23:03:35.702400 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-gzbmf"
Feb 19 23:03:39 crc kubenswrapper[4795]: I0219 23:03:39.565936 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"]
Feb 19 23:03:39 crc kubenswrapper[4795]: I0219 23:03:39.566726 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="octavia-amphora-httpd" containerID="cri-o://a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d" gracePeriod=30
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.165790 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx"
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.169320 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") pod \"04efa30d-5580-4301-8a36-b452e949fcd3\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") "
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.169477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") pod \"04efa30d-5580-4301-8a36-b452e949fcd3\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") "
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.218527 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "04efa30d-5580-4301-8a36-b452e949fcd3" (UID: "04efa30d-5580-4301-8a36-b452e949fcd3"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.222575 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "04efa30d-5580-4301-8a36-b452e949fcd3" (UID: "04efa30d-5580-4301-8a36-b452e949fcd3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.272095 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.272130 4795 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") on node \"crc\" DevicePath \"\""
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.643954 4795 generic.go:334] "Generic (PLEG): container finished" podID="04efa30d-5580-4301-8a36-b452e949fcd3" containerID="a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d" exitCode=0
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.644019 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx"
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.644039 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerDied","Data":"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d"}
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.646428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerDied","Data":"9d50e6342b96a4c35e1fe26f38816a55271a38a4cb1281be352ab5ae7f179533"}
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.646450 4795 scope.go:117] "RemoveContainer" containerID="a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d"
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.672922 4795 scope.go:117] "RemoveContainer" containerID="17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef"
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.680056 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"]
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.689218 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"]
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.702489 4795 scope.go:117] "RemoveContainer" containerID="a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d"
Feb 19 23:03:40 crc kubenswrapper[4795]: E0219 23:03:40.703139 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d\": container with ID starting with a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d not found: ID does not exist" containerID="a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d"
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.703202 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d"} err="failed to get container status \"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d\": rpc error: code = NotFound desc = could not find container \"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d\": container with ID starting with a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d not found: ID does not exist"
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.703230 4795 scope.go:117] "RemoveContainer" containerID="17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef"
Feb 19 23:03:40 crc kubenswrapper[4795]: E0219 23:03:40.703495 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef\": container with ID starting with 17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef not found: ID does not exist" containerID="17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef"
Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.703531 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef"} err="failed to get container status \"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef\": rpc error: code = NotFound desc = could not find container \"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef\": container with ID starting with 17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef not found: ID does not exist"
Feb 19 23:03:41 crc kubenswrapper[4795]: I0219 23:03:41.528326 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" path="/var/lib/kubelet/pods/04efa30d-5580-4301-8a36-b452e949fcd3/volumes"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511194 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-vp2hp"]
Feb 19 23:03:42 crc kubenswrapper[4795]: E0219 23:03:42.511734 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="octavia-amphora-httpd"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511750 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="octavia-amphora-httpd"
Feb 19 23:03:42 crc kubenswrapper[4795]: E0219 23:03:42.511777 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="init"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511787 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="init"
Feb 19 23:03:42 crc kubenswrapper[4795]: E0219 23:03:42.511809 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="octavia-db-sync"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511818 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="octavia-db-sync"
Feb 19 23:03:42 crc kubenswrapper[4795]: E0219 23:03:42.511838 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="init"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511847 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="init"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.512066 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="octavia-amphora-httpd"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.512079 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="octavia-db-sync"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.514553 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.517522 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.542836 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-vp2hp"]
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.617665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-amphora-image\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.617757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-httpd-config\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.722114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-amphora-image\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.722199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-httpd-config\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.722854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-amphora-image\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.731536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-httpd-config\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp"
Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.854686 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp"
Feb 19 23:03:43 crc kubenswrapper[4795]: I0219 23:03:43.364147 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-vp2hp"]
Feb 19 23:03:43 crc kubenswrapper[4795]: I0219 23:03:43.672329 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" event={"ID":"73c5ad0c-a7f2-414d-a1f8-041a807d82b9","Type":"ContainerStarted","Data":"3ce38c4982eed68631a5ca80791e20aaf553f425b0dd44ffa40018ca0b479c31"}
Feb 19 23:03:44 crc kubenswrapper[4795]: I0219 23:03:44.511877 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"
Feb 19 23:03:44 crc kubenswrapper[4795]: E0219 23:03:44.512660 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:03:44 crc kubenswrapper[4795]: I0219 23:03:44.685190 4795 generic.go:334] "Generic (PLEG): container finished" podID="73c5ad0c-a7f2-414d-a1f8-041a807d82b9" containerID="5dc328a328718ac247257d6a6628622c81badfd14393f0cf9cd608946e9727fe" exitCode=0
Feb 19 23:03:44 crc kubenswrapper[4795]: I0219 23:03:44.685256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" event={"ID":"73c5ad0c-a7f2-414d-a1f8-041a807d82b9","Type":"ContainerDied","Data":"5dc328a328718ac247257d6a6628622c81badfd14393f0cf9cd608946e9727fe"}
Feb 19 23:03:45 crc kubenswrapper[4795]: I0219 23:03:45.694888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" event={"ID":"73c5ad0c-a7f2-414d-a1f8-041a807d82b9","Type":"ContainerStarted","Data":"a0a495de52f9506a01aa5e9822d5ed0280e10e8b48c5690a8d17ca946352315f"}
Feb 19 23:03:50 crc kubenswrapper[4795]: I0219 23:03:50.940598 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" podStartSLOduration=8.519854308 podStartE2EDuration="8.940577975s" podCreationTimestamp="2026-02-19 23:03:42 +0000 UTC" firstStartedPulling="2026-02-19 23:03:43.364614037 +0000 UTC m=+5734.557131911" lastFinishedPulling="2026-02-19 23:03:43.785337714 +0000 UTC m=+5734.977855578" observedRunningTime="2026-02-19 23:03:45.720666068 +0000 UTC m=+5736.913183922" watchObservedRunningTime="2026-02-19 23:03:50.940577975 +0000 UTC m=+5742.133095849"
Feb 19 23:03:50 crc kubenswrapper[4795]: I0219 23:03:50.944927 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"]
Feb 19 23:03:50 crc kubenswrapper[4795]: I0219 23:03:50.948651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:50 crc kubenswrapper[4795]: I0219 23:03:50.958388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"]
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.085461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.085604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.085629 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187627 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.212634 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.285542 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.757356 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"]
Feb 19 23:03:52 crc kubenswrapper[4795]: I0219 23:03:52.774716 4795 generic.go:334] "Generic (PLEG): container finished" podID="604a8d08-540f-450f-ae5a-f627d2023851" containerID="d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14" exitCode=0
Feb 19 23:03:52 crc kubenswrapper[4795]: I0219 23:03:52.774757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerDied","Data":"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14"}
Feb 19 23:03:52 crc kubenswrapper[4795]: I0219 23:03:52.774962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerStarted","Data":"9d38bc5478406d8bb61b38691f880d15c257a58823105f014b66c85569ab6ae9"}
Feb 19 23:03:53 crc kubenswrapper[4795]: I0219 23:03:53.784953 4795 generic.go:334] "Generic (PLEG): container finished" podID="604a8d08-540f-450f-ae5a-f627d2023851" containerID="11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50" exitCode=0
Feb 19 23:03:53 crc kubenswrapper[4795]: I0219 23:03:53.785030 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerDied","Data":"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50"}
Feb 19 23:03:54 crc kubenswrapper[4795]: I0219 23:03:54.800624 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerStarted","Data":"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115"}
Feb 19 23:03:54 crc kubenswrapper[4795]: I0219 23:03:54.839085 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ggcwd" podStartSLOduration=3.400687603 podStartE2EDuration="4.839062664s" podCreationTimestamp="2026-02-19 23:03:50 +0000 UTC" firstStartedPulling="2026-02-19 23:03:52.777508922 +0000 UTC m=+5743.970026786" lastFinishedPulling="2026-02-19 23:03:54.215883993 +0000 UTC m=+5745.408401847" observedRunningTime="2026-02-19 23:03:54.82871624 +0000 UTC m=+5746.021234124" watchObservedRunningTime="2026-02-19 23:03:54.839062664 +0000 UTC m=+5746.031580548"
Feb 19 23:03:56 crc kubenswrapper[4795]: I0219 23:03:56.511344 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"
Feb 19 23:03:56 crc kubenswrapper[4795]: E0219 23:03:56.512807 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:04:01 crc kubenswrapper[4795]: I0219 23:04:01.289538 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:04:01 crc kubenswrapper[4795]: I0219 23:04:01.291232 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:04:01 crc kubenswrapper[4795]: I0219 23:04:01.353017 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:04:01 crc kubenswrapper[4795]: I0219 23:04:01.928493 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:04:02 crc kubenswrapper[4795]: I0219 23:04:02.007460 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"]
Feb 19 23:04:03 crc kubenswrapper[4795]: I0219 23:04:03.880315 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ggcwd" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="registry-server" containerID="cri-o://eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115" gracePeriod=2
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.355775 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.460647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") pod \"604a8d08-540f-450f-ae5a-f627d2023851\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") "
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.460865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") pod \"604a8d08-540f-450f-ae5a-f627d2023851\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") "
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.461056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") pod \"604a8d08-540f-450f-ae5a-f627d2023851\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") "
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.461663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities" (OuterVolumeSpecName: "utilities") pod "604a8d08-540f-450f-ae5a-f627d2023851" (UID: "604a8d08-540f-450f-ae5a-f627d2023851"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.466772 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7" (OuterVolumeSpecName: "kube-api-access-dl9s7") pod "604a8d08-540f-450f-ae5a-f627d2023851" (UID: "604a8d08-540f-450f-ae5a-f627d2023851"). InnerVolumeSpecName "kube-api-access-dl9s7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.482899 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "604a8d08-540f-450f-ae5a-f627d2023851" (UID: "604a8d08-540f-450f-ae5a-f627d2023851"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.563330 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.563368 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.563377 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.891565 4795 generic.go:334] "Generic (PLEG): container finished" podID="604a8d08-540f-450f-ae5a-f627d2023851" containerID="eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115" exitCode=0
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.891607 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerDied","Data":"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115"}
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.891660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerDied","Data":"9d38bc5478406d8bb61b38691f880d15c257a58823105f014b66c85569ab6ae9"}
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.891676 4795 scope.go:117] "RemoveContainer" containerID="eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.891811 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggcwd"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.918934 4795 scope.go:117] "RemoveContainer" containerID="11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.949871 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"]
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.966458 4795 scope.go:117] "RemoveContainer" containerID="d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.972524 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"]
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.996243 4795 scope.go:117] "RemoveContainer" containerID="eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115"
Feb 19 23:04:04 crc kubenswrapper[4795]: E0219 23:04:04.996711 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115\": container with ID starting with eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115 not found: ID does not exist" containerID="eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.996803 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115"} err="failed to get container status \"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115\": rpc error: code = NotFound desc = could not find container \"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115\": container with ID starting with eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115 not found: ID does not exist"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.996880 4795 scope.go:117] "RemoveContainer" containerID="11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50"
Feb 19 23:04:04 crc kubenswrapper[4795]: E0219 23:04:04.997273 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50\": container with ID starting with 11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50 not found: ID does not exist" containerID="11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.997335 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50"} err="failed to get container status \"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50\": rpc error: code = NotFound desc = could not find container \"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50\": container with ID starting with 11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50 not found: ID does not exist"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.997372 4795 scope.go:117] "RemoveContainer" containerID="d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14"
Feb 19 23:04:04 crc kubenswrapper[4795]: E0219 23:04:04.997831 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14\": container with ID starting with d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14 not found: ID does not exist" containerID="d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14"
Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.997907 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14"} err="failed to get container status \"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14\": rpc error: code = NotFound desc = could not find container \"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14\": container with ID starting with d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14 not found: ID does not exist"
Feb 19 23:04:05 crc kubenswrapper[4795]: I0219 23:04:05.522484 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604a8d08-540f-450f-ae5a-f627d2023851" path="/var/lib/kubelet/pods/604a8d08-540f-450f-ae5a-f627d2023851/volumes"
Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.498583 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-g59hr"]
Feb 19 23:04:09 crc kubenswrapper[4795]: E0219 23:04:09.502695 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="extract-content"
Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.502741 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="extract-content"
Feb 19 23:04:09 crc kubenswrapper[4795]: E0219 23:04:09.502947 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="registry-server"
Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.502963 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="registry-server"
Feb 19 23:04:09 crc kubenswrapper[4795]: E0219 23:04:09.503012 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="extract-utilities"
Feb 19 23:04:09
crc kubenswrapper[4795]: I0219 23:04:09.503021 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="extract-utilities" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.505905 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="registry-server" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.509754 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.520761 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.522884 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.527495 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.563412 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-g59hr"] Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657249 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-scripts\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-config-data\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " 
pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657344 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-combined-ca-bundle\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/107db266-c130-4312-be67-ffe75016fd44-config-data-merged\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/107db266-c130-4312-be67-ffe75016fd44-hm-ports\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657482 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-amphora-certs\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.758626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-scripts\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " 
pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.758919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-config-data\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.758961 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-combined-ca-bundle\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.759003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/107db266-c130-4312-be67-ffe75016fd44-config-data-merged\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.759069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/107db266-c130-4312-be67-ffe75016fd44-hm-ports\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.759116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-amphora-certs\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc 
kubenswrapper[4795]: I0219 23:04:09.759950 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/107db266-c130-4312-be67-ffe75016fd44-config-data-merged\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.760597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/107db266-c130-4312-be67-ffe75016fd44-hm-ports\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.766134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-config-data\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.769032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-combined-ca-bundle\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.769136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-amphora-certs\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.779799 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-scripts\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.856874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:10 crc kubenswrapper[4795]: I0219 23:04:10.469812 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-g59hr"] Feb 19 23:04:10 crc kubenswrapper[4795]: I0219 23:04:10.945401 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g59hr" event={"ID":"107db266-c130-4312-be67-ffe75016fd44","Type":"ContainerStarted","Data":"28f69fbcb2e7fa6bbca67e47ba798af54fc3eb6462fa131a9d6bebe6df1bb316"} Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.511719 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:04:11 crc kubenswrapper[4795]: E0219 23:04:11.512397 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.738287 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-cm4g6"] Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.739978 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.743620 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.744027 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.758561 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-cm4g6"] Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907011 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data-merged\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-amphora-certs\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907077 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-combined-ca-bundle\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907157 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907263 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7167d9ee-5127-43c9-957a-598d9dcfecb3-hm-ports\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-scripts\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.957239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g59hr" event={"ID":"107db266-c130-4312-be67-ffe75016fd44","Type":"ContainerStarted","Data":"107a2524974dddf0c18c6e83d9afecbe178f926101f7f9b4b0ed708d237cc506"} Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.008720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7167d9ee-5127-43c9-957a-598d9dcfecb3-hm-ports\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.008772 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-scripts\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " 
pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009186 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data-merged\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-amphora-certs\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-combined-ca-bundle\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009301 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7167d9ee-5127-43c9-957a-598d9dcfecb3-hm-ports\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 
23:04:12.010013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data-merged\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.015521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.015690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-amphora-certs\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.016296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-scripts\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.022919 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-combined-ca-bundle\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.056797 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.709476 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-cm4g6"] Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.723212 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.968070 4795 generic.go:334] "Generic (PLEG): container finished" podID="107db266-c130-4312-be67-ffe75016fd44" containerID="107a2524974dddf0c18c6e83d9afecbe178f926101f7f9b4b0ed708d237cc506" exitCode=0 Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.968134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g59hr" event={"ID":"107db266-c130-4312-be67-ffe75016fd44","Type":"ContainerDied","Data":"107a2524974dddf0c18c6e83d9afecbe178f926101f7f9b4b0ed708d237cc506"} Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.972277 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cm4g6" event={"ID":"7167d9ee-5127-43c9-957a-598d9dcfecb3","Type":"ContainerStarted","Data":"bdf026767620b1d93453c9606f54712db3d1d189eae8b3852b6cd91396f98126"} Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.649084 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-pktjg"] Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.651581 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.656544 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.656566 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.670534 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-pktjg"] Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.754235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-amphora-certs\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.760583 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data-merged\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.760736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-scripts\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.761002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cfbfd9d0-564e-41d0-8171-5f32f380a3df-hm-ports\") pod 
\"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.761129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-combined-ca-bundle\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.761283 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.863311 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.863388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-amphora-certs\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.863413 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data-merged\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " 
pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.863447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-scripts\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.864122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data-merged\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.864619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cfbfd9d0-564e-41d0-8171-5f32f380a3df-hm-ports\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.865598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cfbfd9d0-564e-41d0-8171-5f32f380a3df-hm-ports\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.865737 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-combined-ca-bundle\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.868943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-combined-ca-bundle\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.872550 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.873156 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-scripts\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.874394 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-amphora-certs\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.992087 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-pktjg" Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.004643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g59hr" event={"ID":"107db266-c130-4312-be67-ffe75016fd44","Type":"ContainerStarted","Data":"4ae2b472afcfd0f46d4752b9c1fda3f7a0a7b5b43c040a89febfab65f4737409"} Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.005329 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.032776 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-g59hr" podStartSLOduration=5.032758171 podStartE2EDuration="5.032758171s" podCreationTimestamp="2026-02-19 23:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:14.02112709 +0000 UTC m=+5765.213644954" watchObservedRunningTime="2026-02-19 23:04:14.032758171 +0000 UTC m=+5765.225276035" Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.747400 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-pktjg"] Feb 19 23:04:14 crc kubenswrapper[4795]: W0219 23:04:14.762235 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfbfd9d0_564e_41d0_8171_5f32f380a3df.slice/crio-e179c81ae7b94c11e346ae5700a062a7f4fba241804b5adacc27b71f989fb28e WatchSource:0}: Error finding container e179c81ae7b94c11e346ae5700a062a7f4fba241804b5adacc27b71f989fb28e: Status 404 returned error can't find the container with id e179c81ae7b94c11e346ae5700a062a7f4fba241804b5adacc27b71f989fb28e Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.835146 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-g59hr"] Feb 19 23:04:15 crc 
kubenswrapper[4795]: I0219 23:04:15.016132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cm4g6" event={"ID":"7167d9ee-5127-43c9-957a-598d9dcfecb3","Type":"ContainerStarted","Data":"1accc9d526d6fda6c3e60fa5ba08048673c2630dfb31c567d1cbf3b6e78ba1c9"} Feb 19 23:04:15 crc kubenswrapper[4795]: I0219 23:04:15.018129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-pktjg" event={"ID":"cfbfd9d0-564e-41d0-8171-5f32f380a3df","Type":"ContainerStarted","Data":"e179c81ae7b94c11e346ae5700a062a7f4fba241804b5adacc27b71f989fb28e"} Feb 19 23:04:16 crc kubenswrapper[4795]: I0219 23:04:16.033139 4795 generic.go:334] "Generic (PLEG): container finished" podID="7167d9ee-5127-43c9-957a-598d9dcfecb3" containerID="1accc9d526d6fda6c3e60fa5ba08048673c2630dfb31c567d1cbf3b6e78ba1c9" exitCode=0 Feb 19 23:04:16 crc kubenswrapper[4795]: I0219 23:04:16.033330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cm4g6" event={"ID":"7167d9ee-5127-43c9-957a-598d9dcfecb3","Type":"ContainerDied","Data":"1accc9d526d6fda6c3e60fa5ba08048673c2630dfb31c567d1cbf3b6e78ba1c9"} Feb 19 23:04:17 crc kubenswrapper[4795]: I0219 23:04:17.044774 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:17 crc kubenswrapper[4795]: I0219 23:04:17.071220 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-cm4g6" podStartSLOduration=4.623196604 podStartE2EDuration="6.071203387s" podCreationTimestamp="2026-02-19 23:04:11 +0000 UTC" firstStartedPulling="2026-02-19 23:04:12.72275659 +0000 UTC m=+5763.915274494" lastFinishedPulling="2026-02-19 23:04:14.170763413 +0000 UTC m=+5765.363281277" observedRunningTime="2026-02-19 23:04:17.067520242 +0000 UTC m=+5768.260038126" watchObservedRunningTime="2026-02-19 23:04:17.071203387 +0000 UTC m=+5768.263721251" Feb 19 23:04:18 crc 
kubenswrapper[4795]: I0219 23:04:18.054297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cm4g6" event={"ID":"7167d9ee-5127-43c9-957a-598d9dcfecb3","Type":"ContainerStarted","Data":"ddead98e84d71253645b8249ecc7cd9d813a41ec08a5fb7c30e015c4c879ed6e"} Feb 19 23:04:18 crc kubenswrapper[4795]: I0219 23:04:18.055767 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfbfd9d0-564e-41d0-8171-5f32f380a3df" containerID="7b24faaea42b781c594aaea87b574d69779fd11ee1b15c67f94b3fb5af59435a" exitCode=0 Feb 19 23:04:18 crc kubenswrapper[4795]: I0219 23:04:18.055811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-pktjg" event={"ID":"cfbfd9d0-564e-41d0-8171-5f32f380a3df","Type":"ContainerDied","Data":"7b24faaea42b781c594aaea87b574d69779fd11ee1b15c67f94b3fb5af59435a"} Feb 19 23:04:20 crc kubenswrapper[4795]: I0219 23:04:20.074944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-pktjg" event={"ID":"cfbfd9d0-564e-41d0-8171-5f32f380a3df","Type":"ContainerStarted","Data":"b3ab6e660a9f4d36d1c9e3c306b6ac94d1db18c70c8b4cb1f60cbf46841f718f"} Feb 19 23:04:20 crc kubenswrapper[4795]: I0219 23:04:20.075653 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-pktjg" Feb 19 23:04:20 crc kubenswrapper[4795]: I0219 23:04:20.108452 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-pktjg" podStartSLOduration=5.292556709 podStartE2EDuration="7.108434319s" podCreationTimestamp="2026-02-19 23:04:13 +0000 UTC" firstStartedPulling="2026-02-19 23:04:14.765548438 +0000 UTC m=+5765.958066292" lastFinishedPulling="2026-02-19 23:04:16.581426028 +0000 UTC m=+5767.773943902" observedRunningTime="2026-02-19 23:04:20.098971481 +0000 UTC m=+5771.291489375" watchObservedRunningTime="2026-02-19 23:04:20.108434319 +0000 UTC m=+5771.300952183" Feb 19 23:04:22 crc kubenswrapper[4795]: I0219 
23:04:22.513498 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:04:22 crc kubenswrapper[4795]: E0219 23:04:22.514354 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:04:24 crc kubenswrapper[4795]: I0219 23:04:24.883189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:27 crc kubenswrapper[4795]: I0219 23:04:27.093984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:29 crc kubenswrapper[4795]: I0219 23:04:29.034709 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-pktjg" Feb 19 23:04:37 crc kubenswrapper[4795]: I0219 23:04:37.512270 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:04:37 crc kubenswrapper[4795]: E0219 23:04:37.513130 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.521499 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 
23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.523567 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.529255 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.531957 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.531961 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.532424 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-s4gpl" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.532631 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.570946 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.571591 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-log" containerID="cri-o://e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" gracePeriod=30 Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.573181 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-httpd" containerID="cri-o://f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" gracePeriod=30 Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.638727 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.641662 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" containerID="cri-o://2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511" gracePeriod=30 Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.641683 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" containerID="cri-o://1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b" gracePeriod=30 Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.690769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.690858 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.690955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 
crc kubenswrapper[4795]: I0219 23:04:45.691000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.691084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.694159 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.696997 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.724090 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793533 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793773 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") pod 
\"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.794503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.795721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.796312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.802431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.816592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc 
kubenswrapper[4795]: I0219 23:04:45.851640 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899811 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899834 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899952 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") pod \"horizon-844c94496f-brkd8\" (UID: 
\"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.900095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.901439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.903022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.907772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.915643 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.108623 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.337281 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.352026 4795 generic.go:334] "Generic (PLEG): container finished" podID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerID="e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" exitCode=143 Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.352101 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerDied","Data":"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58"} Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.355964 4795 generic.go:334] "Generic (PLEG): container finished" podID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerID="2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511" exitCode=143 Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.355999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerDied","Data":"2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511"} Feb 19 23:04:46 crc kubenswrapper[4795]: W0219 23:04:46.393558 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab2c6ea_997b_4147_a0de_5e3989980973.slice/crio-79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0 WatchSource:0}: Error finding container 79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0: Status 404 returned error can't find the container with id 79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0 Feb 19 23:04:46 crc kubenswrapper[4795]: 
I0219 23:04:46.393752 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.395644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.423301 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.434143 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514332 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514582 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617042 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617207 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617263 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") pod 
\"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.618148 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.618393 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.618785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.624669 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc 
kubenswrapper[4795]: I0219 23:04:46.637527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.705157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:04:46 crc kubenswrapper[4795]: W0219 23:04:46.713908 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb48bdd6_abf1_4115_8357_79c56555d51b.slice/crio-a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5 WatchSource:0}: Error finding container a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5: Status 404 returned error can't find the container with id a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5 Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.747949 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:47 crc kubenswrapper[4795]: I0219 23:04:47.230914 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:04:47 crc kubenswrapper[4795]: W0219 23:04:47.234903 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod353e54a0_06cb_4876_af76_78bcd1bb3a22.slice/crio-582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617 WatchSource:0}: Error finding container 582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617: Status 404 returned error can't find the container with id 582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617 Feb 19 23:04:47 crc kubenswrapper[4795]: I0219 23:04:47.368355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerStarted","Data":"79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0"} Feb 19 23:04:47 crc kubenswrapper[4795]: I0219 23:04:47.369892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerStarted","Data":"a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5"} Feb 19 23:04:47 crc kubenswrapper[4795]: I0219 23:04:47.371172 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerStarted","Data":"582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617"} Feb 19 23:04:48 crc kubenswrapper[4795]: I0219 23:04:48.847507 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" probeResult="failure" output="Get 
\"http://10.217.1.43:9292/healthcheck\": read tcp 10.217.0.2:55430->10.217.1.43:9292: read: connection reset by peer" Feb 19 23:04:48 crc kubenswrapper[4795]: I0219 23:04:48.847542 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.43:9292/healthcheck\": read tcp 10.217.0.2:55422->10.217.1.43:9292: read: connection reset by peer" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.350790 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399019 4795 generic.go:334] "Generic (PLEG): container finished" podID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerID="f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" exitCode=0 Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerDied","Data":"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648"} Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399107 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399131 4795 scope.go:117] "RemoveContainer" containerID="f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399119 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerDied","Data":"80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac"} Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.407708 4795 generic.go:334] "Generic (PLEG): container finished" podID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerID="1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b" exitCode=0 Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.407747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerDied","Data":"1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b"} Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.442865 4795 scope.go:117] "RemoveContainer" containerID="e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479057 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: 
\"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479316 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479565 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479618 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.481599 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run" (OuterVolumeSpecName: 
"httpd-run") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.481893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs" (OuterVolumeSpecName: "logs") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.485610 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts" (OuterVolumeSpecName: "scripts") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.485949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph" (OuterVolumeSpecName: "ceph") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.495561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd" (OuterVolumeSpecName: "kube-api-access-srnpd") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "kube-api-access-srnpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.519197 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.547753 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data" (OuterVolumeSpecName: "config-data") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.568457 4795 scope.go:117] "RemoveContainer" containerID="f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" Feb 19 23:04:49 crc kubenswrapper[4795]: E0219 23:04:49.569223 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648\": container with ID starting with f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648 not found: ID does not exist" containerID="f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.569262 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648"} err="failed to get container status \"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648\": rpc error: code = NotFound desc = could not find container 
\"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648\": container with ID starting with f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648 not found: ID does not exist" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.569291 4795 scope.go:117] "RemoveContainer" containerID="e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" Feb 19 23:04:49 crc kubenswrapper[4795]: E0219 23:04:49.569952 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58\": container with ID starting with e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58 not found: ID does not exist" containerID="e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.569985 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58"} err="failed to get container status \"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58\": rpc error: code = NotFound desc = could not find container \"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58\": container with ID starting with e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58 not found: ID does not exist" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581852 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581880 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581888 
4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581899 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581908 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581916 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581925 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.737223 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.747870 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.767467 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:49 crc kubenswrapper[4795]: E0219 23:04:49.768122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-log" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 
23:04:49.768157 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-log" Feb 19 23:04:49 crc kubenswrapper[4795]: E0219 23:04:49.768247 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-httpd" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.768262 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-httpd" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.768631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-log" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.768690 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-httpd" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.770446 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.774791 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.788031 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.888685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889265 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " 
pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsqt\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-kube-api-access-bqsqt\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889366 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-logs\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889386 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-ceph\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " 
pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009202 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsqt\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-kube-api-access-bqsqt\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-logs\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-ceph\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: 
I0219 23:04:50.016873 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.016911 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-logs\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.017571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.023695 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-ceph\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.035398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.036068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqsqt\" (UniqueName: 
\"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-kube-api-access-bqsqt\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.036744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.121718 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.512367 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:04:50 crc kubenswrapper[4795]: E0219 23:04:50.512583 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:04:51 crc kubenswrapper[4795]: I0219 23:04:51.526588 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" path="/var/lib/kubelet/pods/5ba19509-98fd-4ae4-b9ab-673c27ab8e85/volumes" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.795905 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.927798 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.927872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.927898 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.927947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.928003 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.928077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.928137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.939667 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs" (OuterVolumeSpecName: "logs") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.939906 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.988447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts" (OuterVolumeSpecName: "scripts") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.988638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph" (OuterVolumeSpecName: "ceph") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.994573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh" (OuterVolumeSpecName: "kube-api-access-ljtbh") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "kube-api-access-ljtbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.031559 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.031592 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.031603 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.031614 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc 
kubenswrapper[4795]: I0219 23:04:55.031625 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.101340 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.132789 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.165492 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data" (OuterVolumeSpecName: "config-data") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.235678 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.331062 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.461029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerStarted","Data":"1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.461451 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerStarted","Data":"b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.466048 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerStarted","Data":"aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.466115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerStarted","Data":"b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.466198 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79bb8b759c-wxdkj" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon-log" 
containerID="cri-o://b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f" gracePeriod=30 Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.466261 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79bb8b759c-wxdkj" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon" containerID="cri-o://aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378" gracePeriod=30 Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.469534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerStarted","Data":"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.469596 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerStarted","Data":"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.472179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerDied","Data":"5b24832724357e9fe3cd524ab47d5c8ea237d7ac8d26ead3ec336002073cfe70"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.472243 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.472260 4795 scope.go:117] "RemoveContainer" containerID="1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.474198 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd4ac280-c0e4-46e3-95c8-5e051c96f32e","Type":"ContainerStarted","Data":"8982d489f3ba63018dcae7db8988f20bf8ac06ffc172b54c1199a4804d4202c6"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.491522 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-548bf4c685-852ql" podStartSLOduration=2.005485808 podStartE2EDuration="9.491496029s" podCreationTimestamp="2026-02-19 23:04:46 +0000 UTC" firstStartedPulling="2026-02-19 23:04:47.238309064 +0000 UTC m=+5798.430826928" lastFinishedPulling="2026-02-19 23:04:54.724319245 +0000 UTC m=+5805.916837149" observedRunningTime="2026-02-19 23:04:55.485081337 +0000 UTC m=+5806.677599221" watchObservedRunningTime="2026-02-19 23:04:55.491496029 +0000 UTC m=+5806.684013893" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.516502 4795 scope.go:117] "RemoveContainer" containerID="2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.522939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-844c94496f-brkd8" podStartSLOduration=2.519107114 podStartE2EDuration="10.522919492s" podCreationTimestamp="2026-02-19 23:04:45 +0000 UTC" firstStartedPulling="2026-02-19 23:04:46.717462131 +0000 UTC m=+5797.909979995" lastFinishedPulling="2026-02-19 23:04:54.721274499 +0000 UTC m=+5805.913792373" observedRunningTime="2026-02-19 23:04:55.509148791 +0000 UTC m=+5806.701666665" watchObservedRunningTime="2026-02-19 23:04:55.522919492 +0000 UTC m=+5806.715437356" Feb 19 
23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.536960 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79bb8b759c-wxdkj" podStartSLOduration=2.239893859 podStartE2EDuration="10.536937351s" podCreationTimestamp="2026-02-19 23:04:45 +0000 UTC" firstStartedPulling="2026-02-19 23:04:46.40183764 +0000 UTC m=+5797.594355504" lastFinishedPulling="2026-02-19 23:04:54.698881112 +0000 UTC m=+5805.891398996" observedRunningTime="2026-02-19 23:04:55.534073199 +0000 UTC m=+5806.726591083" watchObservedRunningTime="2026-02-19 23:04:55.536937351 +0000 UTC m=+5806.729455215" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.583674 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.604460 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.614983 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: E0219 23:04:55.618139 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.618352 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" Feb 19 23:04:55 crc kubenswrapper[4795]: E0219 23:04:55.618462 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.618550 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.620150 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.620281 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.625614 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.641527 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.646895 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdch\" (UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-kube-api-access-wrdch\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749557 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749582 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.851783 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.852967 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdch\" 
(UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-kube-api-access-wrdch\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853253 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853986 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.854085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.854360 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.860015 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.862666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.864350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.870468 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.878877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdch\" (UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-kube-api-access-wrdch\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.016283 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.108715 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.108765 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.491397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd4ac280-c0e4-46e3-95c8-5e051c96f32e","Type":"ContainerStarted","Data":"31fabd4d7a61a9a58d9ad9aa5dc3005e8d81ec5a949d4fa86467ff0dd972f904"} Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.628221 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.749956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.751926 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.501972 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7d41b06-abb7-4a30-a29c-3b9d66706d8f","Type":"ContainerStarted","Data":"d7187e932a3c630957fc932dcb0516a262cbd3e096ff9d008d16d55591e9d7a2"} Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.502404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7d41b06-abb7-4a30-a29c-3b9d66706d8f","Type":"ContainerStarted","Data":"0830d7fc6b4ddd1fb3f0c43eedd5e0ef6164946ae562a47ceef1d3553930bb86"} Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.505728 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd4ac280-c0e4-46e3-95c8-5e051c96f32e","Type":"ContainerStarted","Data":"1d9e33caaf030e67fdfb0c6e7e0023ecd6c71ed6e4784f944811f7b33d74d1c4"} Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.528431 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" path="/var/lib/kubelet/pods/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc/volumes" Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.536000 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.535981636 podStartE2EDuration="8.535981636s" podCreationTimestamp="2026-02-19 23:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:57.531624492 +0000 UTC m=+5808.724142456" watchObservedRunningTime="2026-02-19 23:04:57.535981636 +0000 UTC m=+5808.728499500" Feb 19 23:04:58 crc kubenswrapper[4795]: I0219 23:04:58.529499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7d41b06-abb7-4a30-a29c-3b9d66706d8f","Type":"ContainerStarted","Data":"f28f0381bb04b7582c79e7da10be4145ec5a60513881210191f53ef170db2ca4"} Feb 19 23:04:58 crc kubenswrapper[4795]: I0219 23:04:58.572191 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.572145215 podStartE2EDuration="3.572145215s" podCreationTimestamp="2026-02-19 23:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:58.550414237 +0000 UTC m=+5809.742932111" watchObservedRunningTime="2026-02-19 23:04:58.572145215 +0000 UTC m=+5809.764663099" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.122991 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.124577 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.178725 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.184086 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.545857 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.545908 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 23:05:01 crc kubenswrapper[4795]: I0219 23:05:01.524415 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:01 crc kubenswrapper[4795]: E0219 23:05:01.525244 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:05:03 crc kubenswrapper[4795]: I0219 23:05:03.537758 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 23:05:03 crc kubenswrapper[4795]: I0219 23:05:03.572001 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Feb 19 23:05:04 crc kubenswrapper[4795]: I0219 23:05:04.047173 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 23:05:04 crc kubenswrapper[4795]: I0219 23:05:04.062649 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 23:05:04 crc kubenswrapper[4795]: I0219 23:05:04.071371 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 23:05:04 crc kubenswrapper[4795]: I0219 23:05:04.079745 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 23:05:05 crc kubenswrapper[4795]: I0219 23:05:05.523358 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" path="/var/lib/kubelet/pods/a4ca6125-46fa-4dd9-8d20-3816b6c09066/volumes" Feb 19 23:05:05 crc kubenswrapper[4795]: I0219 23:05:05.525412 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" path="/var/lib/kubelet/pods/c13f05e4-27de-4750-bb9d-008e3a0be0c7/volumes" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.017446 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.018698 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.071748 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.076391 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: 
I0219 23:05:06.112700 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.602828 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.602875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.750285 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Feb 19 23:05:08 crc kubenswrapper[4795]: I0219 23:05:08.640850 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:08 crc kubenswrapper[4795]: I0219 23:05:08.641316 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 23:05:08 crc kubenswrapper[4795]: I0219 23:05:08.732509 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:10 crc kubenswrapper[4795]: I0219 23:05:10.033861 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 23:05:10 crc kubenswrapper[4795]: I0219 23:05:10.045142 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 23:05:11 crc kubenswrapper[4795]: I0219 23:05:11.533231 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace73a97-1b52-4187-a035-df7a08266bab" path="/var/lib/kubelet/pods/ace73a97-1b52-4187-a035-df7a08266bab/volumes" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.512271 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:16 crc kubenswrapper[4795]: E0219 23:05:16.512884 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.688497 4795 scope.go:117] "RemoveContainer" containerID="b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.731456 4795 scope.go:117] "RemoveContainer" containerID="4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.760583 4795 scope.go:117] "RemoveContainer" containerID="c70a8662daa881e0007310348920cbd44bed7e794700942de323e6f34a2f57fc" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.789439 4795 scope.go:117] "RemoveContainer" containerID="1ad516f68056dfaa3dffe45049adde7607d436761c153d38593aeeda7b4036af" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.871832 4795 scope.go:117] "RemoveContainer" containerID="88b5a89b19e9675d8f8f4f6be28cd30648da8baae5b54e90d50ee586416168ce" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.911037 4795 scope.go:117] "RemoveContainer" containerID="b1d9f8ec39116ffdbb4a902a6f66f52cc6253d77fc5e2ba0e18cacfaee7d6753" Feb 19 23:05:17 crc kubenswrapper[4795]: I0219 23:05:17.884571 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:18 crc kubenswrapper[4795]: I0219 23:05:18.576649 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:05:19 crc kubenswrapper[4795]: I0219 23:05:19.494910 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:20 crc kubenswrapper[4795]: I0219 23:05:20.253952 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:05:20 crc kubenswrapper[4795]: I0219 23:05:20.356610 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:05:20 crc kubenswrapper[4795]: I0219 23:05:20.356831 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon-log" containerID="cri-o://66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" gracePeriod=30 Feb 19 23:05:20 crc kubenswrapper[4795]: I0219 23:05:20.357350 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" containerID="cri-o://098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" gracePeriod=30 Feb 19 23:05:23 crc kubenswrapper[4795]: I0219 23:05:23.765890 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerID="098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" exitCode=0 Feb 19 23:05:23 crc kubenswrapper[4795]: I0219 23:05:23.766020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" 
event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerDied","Data":"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149"} Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.787650 4795 generic.go:334] "Generic (PLEG): container finished" podID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerID="aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378" exitCode=137 Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.788178 4795 generic.go:334] "Generic (PLEG): container finished" podID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerID="b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f" exitCode=137 Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.788203 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerDied","Data":"aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378"} Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.788231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerDied","Data":"b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f"} Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.889036 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065741 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065804 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065883 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.066749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs" (OuterVolumeSpecName: "logs") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.071240 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.071259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt" (OuterVolumeSpecName: "kube-api-access-hwmwt") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "kube-api-access-hwmwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.090187 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts" (OuterVolumeSpecName: "scripts") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.098232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data" (OuterVolumeSpecName: "config-data") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.110115 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168560 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168601 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168611 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168672 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168686 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.797321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" 
event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerDied","Data":"79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0"} Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.797704 4795 scope.go:117] "RemoveContainer" containerID="aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.797389 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.832707 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.841900 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.968817 4795 scope.go:117] "RemoveContainer" containerID="b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f" Feb 19 23:05:27 crc kubenswrapper[4795]: I0219 23:05:27.524382 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" path="/var/lib/kubelet/pods/4ab2c6ea-997b-4147-a0de-5e3989980973/volumes" Feb 19 23:05:31 crc kubenswrapper[4795]: I0219 23:05:31.511814 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:31 crc kubenswrapper[4795]: E0219 23:05:31.515026 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:05:36 crc kubenswrapper[4795]: I0219 23:05:36.109663 
4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Feb 19 23:05:40 crc kubenswrapper[4795]: I0219 23:05:40.048922 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"] Feb 19 23:05:40 crc kubenswrapper[4795]: I0219 23:05:40.061511 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xbwb6"] Feb 19 23:05:40 crc kubenswrapper[4795]: I0219 23:05:40.070671 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"] Feb 19 23:05:40 crc kubenswrapper[4795]: I0219 23:05:40.080036 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xbwb6"] Feb 19 23:05:41 crc kubenswrapper[4795]: I0219 23:05:41.523927 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="635044d2-10e8-457c-b03e-9507a500c7fe" path="/var/lib/kubelet/pods/635044d2-10e8-457c-b03e-9507a500c7fe/volumes" Feb 19 23:05:41 crc kubenswrapper[4795]: I0219 23:05:41.525292 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0369c6f-517b-44b8-968a-a3408c6044d6" path="/var/lib/kubelet/pods/c0369c6f-517b-44b8-968a-a3408c6044d6/volumes" Feb 19 23:05:42 crc kubenswrapper[4795]: I0219 23:05:42.511827 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:42 crc kubenswrapper[4795]: E0219 23:05:42.512481 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:05:46 crc kubenswrapper[4795]: I0219 23:05:46.109570 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Feb 19 23:05:46 crc kubenswrapper[4795]: I0219 23:05:46.110016 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:48 crc kubenswrapper[4795]: I0219 23:05:48.078004 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-p5rjh"] Feb 19 23:05:48 crc kubenswrapper[4795]: I0219 23:05:48.088643 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-p5rjh"] Feb 19 23:05:49 crc kubenswrapper[4795]: I0219 23:05:49.535320 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ad0e107-d857-4118-9582-5039b45f1ec8" path="/var/lib/kubelet/pods/9ad0e107-d857-4118-9582-5039b45f1ec8/volumes" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.853756 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977394 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977546 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977680 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.978134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs" (OuterVolumeSpecName: "logs") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.978718 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.982868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7" (OuterVolumeSpecName: "kube-api-access-p6xf7") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "kube-api-access-p6xf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.982893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.001749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data" (OuterVolumeSpecName: "config-data") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.003904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts" (OuterVolumeSpecName: "scripts") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.042692 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerID="66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" exitCode=137 Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.042785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerDied","Data":"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd"} Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.043058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerDied","Data":"a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5"} Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.043127 4795 scope.go:117] "RemoveContainer" containerID="098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.042815 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.080682 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.080994 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.081009 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.081021 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.081769 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.091305 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.235059 4795 scope.go:117] "RemoveContainer" containerID="66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.257452 4795 scope.go:117] "RemoveContainer" containerID="098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" Feb 19 23:05:51 crc kubenswrapper[4795]: E0219 23:05:51.257936 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149\": container with ID starting with 098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149 not found: ID does not exist" containerID="098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.257980 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149"} err="failed to get container status \"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149\": rpc error: code = NotFound desc = could not find container \"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149\": container with ID starting with 098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149 not found: ID does not exist" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.258006 4795 scope.go:117] "RemoveContainer" containerID="66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" Feb 19 23:05:51 crc kubenswrapper[4795]: E0219 23:05:51.258484 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd\": container with ID starting with 66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd not found: ID does not exist" containerID="66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.258516 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd"} err="failed to get container status \"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd\": rpc error: code = NotFound desc = could not find container \"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd\": container 
with ID starting with 66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd not found: ID does not exist" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.528940 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" path="/var/lib/kubelet/pods/fb48bdd6-abf1-4115-8357-79c56555d51b/volumes" Feb 19 23:05:55 crc kubenswrapper[4795]: I0219 23:05:55.512219 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:55 crc kubenswrapper[4795]: E0219 23:05:55.514062 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.372690 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f75767dd9-c8js2"] Feb 19 23:06:03 crc kubenswrapper[4795]: E0219 23:06:03.373676 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.373695 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: E0219 23:06:03.373724 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.373732 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: 
E0219 23:06:03.373751 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.373759 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: E0219 23:06:03.373792 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.373799 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.374008 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.374035 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.374049 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.374069 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.375376 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.391548 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f75767dd9-c8js2"] Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.513981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvzlz\" (UniqueName: \"kubernetes.io/projected/53ce70ba-9e61-4dbd-b858-7059c82eed67-kube-api-access-hvzlz\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.514415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-scripts\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.514455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53ce70ba-9e61-4dbd-b858-7059c82eed67-horizon-secret-key\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.514518 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ce70ba-9e61-4dbd-b858-7059c82eed67-logs\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.514602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-config-data\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-scripts\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616304 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53ce70ba-9e61-4dbd-b858-7059c82eed67-horizon-secret-key\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ce70ba-9e61-4dbd-b858-7059c82eed67-logs\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-config-data\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616545 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvzlz\" (UniqueName: \"kubernetes.io/projected/53ce70ba-9e61-4dbd-b858-7059c82eed67-kube-api-access-hvzlz\") pod 
\"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.617766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ce70ba-9e61-4dbd-b858-7059c82eed67-logs\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.617897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-scripts\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.619434 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-config-data\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.633078 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53ce70ba-9e61-4dbd-b858-7059c82eed67-horizon-secret-key\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.633237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvzlz\" (UniqueName: \"kubernetes.io/projected/53ce70ba-9e61-4dbd-b858-7059c82eed67-kube-api-access-hvzlz\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc 
kubenswrapper[4795]: I0219 23:06:03.700702 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:04 crc kubenswrapper[4795]: I0219 23:06:04.162506 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f75767dd9-c8js2"] Feb 19 23:06:04 crc kubenswrapper[4795]: I0219 23:06:04.179885 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f75767dd9-c8js2" event={"ID":"53ce70ba-9e61-4dbd-b858-7059c82eed67","Type":"ContainerStarted","Data":"2ea9022ede0c3e50baa39c2347e5ce3c1030986712d1ff78b282a6219767af88"} Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.005101 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.007393 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.018320 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.109938 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.111715 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.114455 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.119584 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.149521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.149616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.196483 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f75767dd9-c8js2" event={"ID":"53ce70ba-9e61-4dbd-b858-7059c82eed67","Type":"ContainerStarted","Data":"1f44a51809ccf2a01357de51840ced27533940ee96c83e26d97d8fba5a6f387b"} Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.196527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f75767dd9-c8js2" event={"ID":"53ce70ba-9e61-4dbd-b858-7059c82eed67","Type":"ContainerStarted","Data":"dd4dcaf29f9033593907f9d3d6a4d67d87ce9f60ff1b51bf67c29e49681ba566"} Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.218762 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-6f75767dd9-c8js2" podStartSLOduration=2.2187418819999998 podStartE2EDuration="2.218741882s" podCreationTimestamp="2026-02-19 23:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:05.210888393 +0000 UTC m=+5876.403406277" watchObservedRunningTime="2026-02-19 23:06:05.218741882 +0000 UTC m=+5876.411259746" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.251532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.251592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.251677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.251937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") pod \"heat-db-create-pvsmv\" (UID: 
\"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.252691 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.275685 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.328147 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.353559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.353631 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.354621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.379721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.428671 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:06 crc kubenswrapper[4795]: I0219 23:06:06.385845 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:06:06 crc kubenswrapper[4795]: W0219 23:06:06.446599 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7049350_2c57_49c2_aef7_b9f0bd28abfc.slice/crio-f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f WatchSource:0}: Error finding container f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f: Status 404 returned error can't find the container with id f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f Feb 19 23:06:06 crc kubenswrapper[4795]: I0219 23:06:06.447665 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.220516 4795 generic.go:334] "Generic (PLEG): container finished" podID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" containerID="394778543b7d586f5d57eb0c386c1afcba774c0cbc8858b3c036d9d786189525" exitCode=0 Feb 19 23:06:07 crc 
kubenswrapper[4795]: I0219 23:06:07.220556 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pvsmv" event={"ID":"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636","Type":"ContainerDied","Data":"394778543b7d586f5d57eb0c386c1afcba774c0cbc8858b3c036d9d786189525"} Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.220944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pvsmv" event={"ID":"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636","Type":"ContainerStarted","Data":"83fe7779207b6996d4e3030b5609b7839791250e025c39eae68cc633d632e6ed"} Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.223348 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" containerID="be3b7d10ee8dbba79631201a8d5da4057d99e4383e860152ba77db4992ff52fc" exitCode=0 Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.223417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4351-account-create-update-7cp54" event={"ID":"c7049350-2c57-49c2-aef7-b9f0bd28abfc","Type":"ContainerDied","Data":"be3b7d10ee8dbba79631201a8d5da4057d99e4383e860152ba77db4992ff52fc"} Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.223443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4351-account-create-update-7cp54" event={"ID":"c7049350-2c57-49c2-aef7-b9f0bd28abfc","Type":"ContainerStarted","Data":"f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f"} Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.512342 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:06:08 crc kubenswrapper[4795]: E0219 23:06:08.513070 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.671517 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.678060 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.718046 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") pod \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.718428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") pod \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.718859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" (UID: "3ce3f6fb-8688-4e53-8d30-e6c7edbf5636"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.719020 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.723500 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf" (OuterVolumeSpecName: "kube-api-access-w5znf") pod "3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" (UID: "3ce3f6fb-8688-4e53-8d30-e6c7edbf5636"). InnerVolumeSpecName "kube-api-access-w5znf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.820550 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") pod \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.821388 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") pod \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.821857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7049350-2c57-49c2-aef7-b9f0bd28abfc" (UID: "c7049350-2c57-49c2-aef7-b9f0bd28abfc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.822772 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.822962 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.823846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c" (OuterVolumeSpecName: "kube-api-access-dm24c") pod "c7049350-2c57-49c2-aef7-b9f0bd28abfc" (UID: "c7049350-2c57-49c2-aef7-b9f0bd28abfc"). InnerVolumeSpecName "kube-api-access-dm24c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.925944 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.247430 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4351-account-create-update-7cp54" event={"ID":"c7049350-2c57-49c2-aef7-b9f0bd28abfc","Type":"ContainerDied","Data":"f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f"} Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.247483 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f" Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.247559 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.252957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pvsmv" event={"ID":"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636","Type":"ContainerDied","Data":"83fe7779207b6996d4e3030b5609b7839791250e025c39eae68cc633d632e6ed"} Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.252998 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83fe7779207b6996d4e3030b5609b7839791250e025c39eae68cc633d632e6ed" Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.253048 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.326743 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-wlhqm"] Feb 19 23:06:10 crc kubenswrapper[4795]: E0219 23:06:10.327438 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" containerName="mariadb-database-create" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.327449 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" containerName="mariadb-database-create" Feb 19 23:06:10 crc kubenswrapper[4795]: E0219 23:06:10.327463 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" containerName="mariadb-account-create-update" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.327468 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" containerName="mariadb-account-create-update" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.327643 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" containerName="mariadb-account-create-update" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.327657 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" containerName="mariadb-database-create" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.329630 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.335705 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-67jp9" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.335731 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.347884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wlhqm"] Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.461532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.461788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.461833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.563923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") pod 
\"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.564053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.564076 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.568994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.581239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.581360 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.664791 
4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:11 crc kubenswrapper[4795]: I0219 23:06:11.124614 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wlhqm"] Feb 19 23:06:11 crc kubenswrapper[4795]: W0219 23:06:11.125123 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497c4c82_13ae_430c_83bd_1f1c4d4683e4.slice/crio-1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a WatchSource:0}: Error finding container 1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a: Status 404 returned error can't find the container with id 1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a Feb 19 23:06:11 crc kubenswrapper[4795]: I0219 23:06:11.285853 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wlhqm" event={"ID":"497c4c82-13ae-430c-83bd-1f1c4d4683e4","Type":"ContainerStarted","Data":"1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a"} Feb 19 23:06:13 crc kubenswrapper[4795]: I0219 23:06:13.701661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:13 crc kubenswrapper[4795]: I0219 23:06:13.702002 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.100277 4795 scope.go:117] "RemoveContainer" containerID="247acf34bea1d6df95accdf51f9a624b45c32153a0b9fa97bc69d2358a9601e8" Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.120736 4795 scope.go:117] "RemoveContainer" containerID="82b2542be6e058170f19505c42c4114714e88b52257f1c0e99664dda5eb05f78" Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.169786 4795 scope.go:117] "RemoveContainer" 
containerID="a01a032519bdc879e7f051388c3f7e0f8289504340bef813d7d1864b844d8771" Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.357568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wlhqm" event={"ID":"497c4c82-13ae-430c-83bd-1f1c4d4683e4","Type":"ContainerStarted","Data":"26cce21cbd7189a101a6a725aa7d2a769b17bb5f8957270015deb5068ba381a3"} Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.386031 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-wlhqm" podStartSLOduration=1.585032536 podStartE2EDuration="7.386006896s" podCreationTimestamp="2026-02-19 23:06:10 +0000 UTC" firstStartedPulling="2026-02-19 23:06:11.128320828 +0000 UTC m=+5882.320838692" lastFinishedPulling="2026-02-19 23:06:16.929295178 +0000 UTC m=+5888.121813052" observedRunningTime="2026-02-19 23:06:17.373366509 +0000 UTC m=+5888.565884384" watchObservedRunningTime="2026-02-19 23:06:17.386006896 +0000 UTC m=+5888.578524780" Feb 19 23:06:20 crc kubenswrapper[4795]: I0219 23:06:20.387454 4795 generic.go:334] "Generic (PLEG): container finished" podID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" containerID="26cce21cbd7189a101a6a725aa7d2a769b17bb5f8957270015deb5068ba381a3" exitCode=0 Feb 19 23:06:20 crc kubenswrapper[4795]: I0219 23:06:20.387588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wlhqm" event={"ID":"497c4c82-13ae-430c-83bd-1f1c4d4683e4","Type":"ContainerDied","Data":"26cce21cbd7189a101a6a725aa7d2a769b17bb5f8957270015deb5068ba381a3"} Feb 19 23:06:20 crc kubenswrapper[4795]: I0219 23:06:20.511962 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:06:20 crc kubenswrapper[4795]: E0219 23:06:20.512400 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.828857 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.987358 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") pod \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.987548 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") pod \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.987768 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") pod \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.997140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6" (OuterVolumeSpecName: "kube-api-access-nf4f6") pod "497c4c82-13ae-430c-83bd-1f1c4d4683e4" (UID: "497c4c82-13ae-430c-83bd-1f1c4d4683e4"). InnerVolumeSpecName "kube-api-access-nf4f6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.020699 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "497c4c82-13ae-430c-83bd-1f1c4d4683e4" (UID: "497c4c82-13ae-430c-83bd-1f1c4d4683e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.065405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data" (OuterVolumeSpecName: "config-data") pod "497c4c82-13ae-430c-83bd-1f1c4d4683e4" (UID: "497c4c82-13ae-430c-83bd-1f1c4d4683e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.089917 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.090413 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.090606 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.406973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wlhqm" 
event={"ID":"497c4c82-13ae-430c-83bd-1f1c4d4683e4","Type":"ContainerDied","Data":"1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a"} Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.407032 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a" Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.407071 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wlhqm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.505960 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-54b48c7f4c-97pnj"] Feb 19 23:06:23 crc kubenswrapper[4795]: E0219 23:06:23.508902 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" containerName="heat-db-sync" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.508931 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" containerName="heat-db-sync" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.509214 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" containerName="heat-db-sync" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.510183 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.517858 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.517891 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-67jp9" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.518148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.529112 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54b48c7f4c-97pnj"] Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.621748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data-custom\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.622142 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-combined-ca-bundle\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.622219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: 
I0219 23:06:23.622260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xx2s\" (UniqueName: \"kubernetes.io/projected/a380f130-e904-41e8-90e2-93bdeb0615d6-kube-api-access-5xx2s\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.691488 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6565dd9f4d-w85dm"] Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.692957 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.696199 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.723983 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6565dd9f4d-w85dm"] Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.725176 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data-custom\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.725313 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-combined-ca-bundle\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.725344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.725367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xx2s\" (UniqueName: \"kubernetes.io/projected/a380f130-e904-41e8-90e2-93bdeb0615d6-kube-api-access-5xx2s\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.763296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-combined-ca-bundle\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.766613 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xx2s\" (UniqueName: \"kubernetes.io/projected/a380f130-e904-41e8-90e2-93bdeb0615d6-kube-api-access-5xx2s\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.777916 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data-custom\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.822885 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-57679899bc-rj6x7"] Feb 19 23:06:23 crc 
kubenswrapper[4795]: I0219 23:06:23.826150 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.839333 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.839394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-combined-ca-bundle\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.839416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdcgf\" (UniqueName: \"kubernetes.io/projected/94ea6e46-bacd-40ca-bce9-0f28656581af-kube-api-access-fdcgf\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.839499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data-custom\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" 
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.840411 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.842371 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.843069 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.873978 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-57679899bc-rj6x7"] Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942063 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-combined-ca-bundle\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdcgf\" (UniqueName: \"kubernetes.io/projected/94ea6e46-bacd-40ca-bce9-0f28656581af-kube-api-access-fdcgf\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-248z5\" (UniqueName: \"kubernetes.io/projected/058c5b61-3ec2-4a88-bea8-59843d00750c-kube-api-access-248z5\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942299 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data-custom\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data-custom\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942623 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-combined-ca-bundle\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.948534 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.960184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data-custom\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.964787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-combined-ca-bundle\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.965343 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdcgf\" (UniqueName: \"kubernetes.io/projected/94ea6e46-bacd-40ca-bce9-0f28656581af-kube-api-access-fdcgf\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.016115 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.044767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-combined-ca-bundle\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.045289 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-248z5\" (UniqueName: \"kubernetes.io/projected/058c5b61-3ec2-4a88-bea8-59843d00750c-kube-api-access-248z5\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.045324 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.045347 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data-custom\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.052044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-combined-ca-bundle\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" 
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.055344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.056536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data-custom\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.061222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-248z5\" (UniqueName: \"kubernetes.io/projected/058c5b61-3ec2-4a88-bea8-59843d00750c-kube-api-access-248z5\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.342345 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.361674 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54b48c7f4c-97pnj"] Feb 19 23:06:24 crc kubenswrapper[4795]: W0219 23:06:24.374099 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda380f130_e904_41e8_90e2_93bdeb0615d6.slice/crio-42e93ff6f262ded38d30a4699b66ef3748d0559062d10f8e7d7630ce3a9f97a2 WatchSource:0}: Error finding container 42e93ff6f262ded38d30a4699b66ef3748d0559062d10f8e7d7630ce3a9f97a2: Status 404 returned error can't find the container with id 42e93ff6f262ded38d30a4699b66ef3748d0559062d10f8e7d7630ce3a9f97a2 Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.435498 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54b48c7f4c-97pnj" event={"ID":"a380f130-e904-41e8-90e2-93bdeb0615d6","Type":"ContainerStarted","Data":"42e93ff6f262ded38d30a4699b66ef3748d0559062d10f8e7d7630ce3a9f97a2"} Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.502100 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6565dd9f4d-w85dm"] Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.824179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-57679899bc-rj6x7"] Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.450739 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-57679899bc-rj6x7" event={"ID":"058c5b61-3ec2-4a88-bea8-59843d00750c","Type":"ContainerStarted","Data":"7c2569b462a1849b91272409407428ae61f0f85c98af819c9c77c8800b7411e2"} Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.453026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" 
event={"ID":"94ea6e46-bacd-40ca-bce9-0f28656581af","Type":"ContainerStarted","Data":"cf7360edbf9d31e08708caf5995d7041976a8fb9fdd9b6931540bad10df2f87c"} Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.455763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54b48c7f4c-97pnj" event={"ID":"a380f130-e904-41e8-90e2-93bdeb0615d6","Type":"ContainerStarted","Data":"28e9017fdd45a316b9c170a0903ec6ce2597a64e8457cb18d9b0c1a4e16d2538"} Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.455893 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.474516 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-54b48c7f4c-97pnj" podStartSLOduration=2.474494172 podStartE2EDuration="2.474494172s" podCreationTimestamp="2026-02-19 23:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:25.469265923 +0000 UTC m=+5896.661783817" watchObservedRunningTime="2026-02-19 23:06:25.474494172 +0000 UTC m=+5896.667012046" Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.653199 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.475606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-57679899bc-rj6x7" event={"ID":"058c5b61-3ec2-4a88-bea8-59843d00750c","Type":"ContainerStarted","Data":"7b7d0aeb0dbc4b713268e62f34080da6273676dd4d7d2d299501d03356596277"} Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.476211 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.477792 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" event={"ID":"94ea6e46-bacd-40ca-bce9-0f28656581af","Type":"ContainerStarted","Data":"9783318731f4eddc5ba92ad6da9e864b2d90e3fc020d553614c899f2e97786e4"} Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.477918 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.492556 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-57679899bc-rj6x7" podStartSLOduration=2.662771808 podStartE2EDuration="4.492541657s" podCreationTimestamp="2026-02-19 23:06:23 +0000 UTC" firstStartedPulling="2026-02-19 23:06:24.81680057 +0000 UTC m=+5896.009318434" lastFinishedPulling="2026-02-19 23:06:26.646570419 +0000 UTC m=+5897.839088283" observedRunningTime="2026-02-19 23:06:27.488206212 +0000 UTC m=+5898.680724076" watchObservedRunningTime="2026-02-19 23:06:27.492541657 +0000 UTC m=+5898.685059521" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.513051 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" podStartSLOduration=2.386660232 podStartE2EDuration="4.513034573s" podCreationTimestamp="2026-02-19 23:06:23 +0000 UTC" firstStartedPulling="2026-02-19 23:06:24.517043594 +0000 UTC m=+5895.709561458" lastFinishedPulling="2026-02-19 23:06:26.643417935 +0000 UTC m=+5897.835935799" observedRunningTime="2026-02-19 23:06:27.508894263 +0000 UTC m=+5898.701412127" watchObservedRunningTime="2026-02-19 23:06:27.513034573 +0000 UTC m=+5898.705552437" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.535089 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.606059 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:06:27 crc 
kubenswrapper[4795]: I0219 23:06:27.606352 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon-log" containerID="cri-o://b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40" gracePeriod=30 Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.606956 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" containerID="cri-o://1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398" gracePeriod=30 Feb 19 23:06:30 crc kubenswrapper[4795]: I0219 23:06:30.056256 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 23:06:30 crc kubenswrapper[4795]: I0219 23:06:30.067309 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 23:06:30 crc kubenswrapper[4795]: I0219 23:06:30.075587 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 23:06:30 crc kubenswrapper[4795]: I0219 23:06:30.084801 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 23:06:31 crc kubenswrapper[4795]: I0219 23:06:31.510282 4795 generic.go:334] "Generic (PLEG): container finished" podID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerID="1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398" exitCode=0 Feb 19 23:06:31 crc kubenswrapper[4795]: I0219 23:06:31.510341 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerDied","Data":"1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398"} Feb 19 23:06:31 crc kubenswrapper[4795]: I0219 23:06:31.528707 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5efc0b6-7441-4f4b-827e-d920c711d076" path="/var/lib/kubelet/pods/b5efc0b6-7441-4f4b-827e-d920c711d076/volumes" Feb 19 23:06:31 crc kubenswrapper[4795]: I0219 23:06:31.529321 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" path="/var/lib/kubelet/pods/e6d0c29a-694d-4afc-ba36-c66fa8fd0328/volumes" Feb 19 23:06:35 crc kubenswrapper[4795]: I0219 23:06:35.456635 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:35 crc kubenswrapper[4795]: I0219 23:06:35.512009 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:06:35 crc kubenswrapper[4795]: E0219 23:06:35.512288 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:06:35 crc kubenswrapper[4795]: I0219 23:06:35.922377 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:36 crc kubenswrapper[4795]: I0219 23:06:36.748742 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Feb 19 23:06:39 crc kubenswrapper[4795]: I0219 23:06:39.032830 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-db-sync-wrz6p"] Feb 19 23:06:39 crc kubenswrapper[4795]: I0219 23:06:39.042665 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wrz6p"] Feb 19 23:06:39 crc kubenswrapper[4795]: I0219 23:06:39.522430 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e95033f-725f-4784-995c-ec7a3b9c24c4" path="/var/lib/kubelet/pods/3e95033f-725f-4784-995c-ec7a3b9c24c4/volumes" Feb 19 23:06:43 crc kubenswrapper[4795]: I0219 23:06:43.888228 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:46 crc kubenswrapper[4795]: I0219 23:06:46.748778 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Feb 19 23:06:48 crc kubenswrapper[4795]: I0219 23:06:48.512324 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:06:48 crc kubenswrapper[4795]: E0219 23:06:48.513128 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.866624 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk"] Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.869690 4795 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.871662 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.894896 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk"] Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.997732 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.998133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.998348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc 
kubenswrapper[4795]: I0219 23:06:54.100354 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.100395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.100550 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.100993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.101189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.122286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.198068 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.630058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk"] Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.747486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerStarted","Data":"d11bd090c6a8a61659e1b16a0e54fc14ff621007a0c7499ea67644b6ce764c7f"} Feb 19 23:06:55 crc kubenswrapper[4795]: I0219 23:06:55.765390 4795 generic.go:334] "Generic (PLEG): container finished" podID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerID="1b77aac585ca3cfe29b317bf1fd045a5d63798719b946034f72426232115ecf7" exitCode=0 Feb 19 23:06:55 crc kubenswrapper[4795]: I0219 23:06:55.765431 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerDied","Data":"1b77aac585ca3cfe29b317bf1fd045a5d63798719b946034f72426232115ecf7"} Feb 19 23:06:56 crc kubenswrapper[4795]: I0219 23:06:56.749000 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Feb 19 23:06:56 crc kubenswrapper[4795]: I0219 23:06:56.749439 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:06:57 crc kubenswrapper[4795]: I0219 23:06:57.795647 4795 generic.go:334] "Generic (PLEG): container finished" podID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerID="b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40" exitCode=137 Feb 19 23:06:57 crc kubenswrapper[4795]: I0219 23:06:57.795958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerDied","Data":"b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40"} Feb 19 23:06:57 crc kubenswrapper[4795]: I0219 23:06:57.799033 4795 generic.go:334] "Generic (PLEG): container finished" podID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerID="7325a5f3066d7e846cc85a6bea1dd01396196730c91dbd4e526befb9417a41b7" exitCode=0 Feb 19 23:06:57 crc kubenswrapper[4795]: I0219 23:06:57.799087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerDied","Data":"7325a5f3066d7e846cc85a6bea1dd01396196730c91dbd4e526befb9417a41b7"} Feb 19 
23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.063196 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201178 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201255 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201923 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs" (OuterVolumeSpecName: "logs") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.215399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.221932 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq" (OuterVolumeSpecName: "kube-api-access-svjbq") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "kube-api-access-svjbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.227380 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts" (OuterVolumeSpecName: "scripts") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.266269 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data" (OuterVolumeSpecName: "config-data") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.303960 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.303998 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.304008 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.304018 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.304026 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.810832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" 
event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerDied","Data":"582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617"} Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.810865 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.810897 4795 scope.go:117] "RemoveContainer" containerID="1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.814832 4795 generic.go:334] "Generic (PLEG): container finished" podID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerID="89a80ef894aa5f55ca88f1134be53fdc59b583e2230ceb9fdaf35bfbb0fe8774" exitCode=0 Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.814884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerDied","Data":"89a80ef894aa5f55ca88f1134be53fdc59b583e2230ceb9fdaf35bfbb0fe8774"} Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.870317 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.880668 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:06:59 crc kubenswrapper[4795]: I0219 23:06:59.027138 4795 scope.go:117] "RemoveContainer" containerID="b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40" Feb 19 23:06:59 crc kubenswrapper[4795]: I0219 23:06:59.525714 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" path="/var/lib/kubelet/pods/353e54a0-06cb-4876-af76-78bcd1bb3a22/volumes" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.183419 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.345553 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") pod \"42ff3cee-7522-42a5-8cc9-b52b30d45220\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.345693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") pod \"42ff3cee-7522-42a5-8cc9-b52b30d45220\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.345732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") pod \"42ff3cee-7522-42a5-8cc9-b52b30d45220\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.350601 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle" (OuterVolumeSpecName: "bundle") pod "42ff3cee-7522-42a5-8cc9-b52b30d45220" (UID: "42ff3cee-7522-42a5-8cc9-b52b30d45220"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.353312 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p" (OuterVolumeSpecName: "kube-api-access-dg59p") pod "42ff3cee-7522-42a5-8cc9-b52b30d45220" (UID: "42ff3cee-7522-42a5-8cc9-b52b30d45220"). InnerVolumeSpecName "kube-api-access-dg59p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.357868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util" (OuterVolumeSpecName: "util") pod "42ff3cee-7522-42a5-8cc9-b52b30d45220" (UID: "42ff3cee-7522-42a5-8cc9-b52b30d45220"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.447934 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.447973 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.447987 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.839113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerDied","Data":"d11bd090c6a8a61659e1b16a0e54fc14ff621007a0c7499ea67644b6ce764c7f"} Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.839160 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d11bd090c6a8a61659e1b16a0e54fc14ff621007a0c7499ea67644b6ce764c7f" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.839230 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:07:03 crc kubenswrapper[4795]: I0219 23:07:03.511606 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:07:03 crc kubenswrapper[4795]: E0219 23:07:03.512184 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.056512 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-w9m97"] Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.065472 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"] Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.073945 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-w9m97"] Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.083000 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"] Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.886052 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw"] Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.887953 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.888074 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.888205 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="pull" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.888291 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="pull" Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.888604 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="util" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.888703 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="util" Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.888789 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="extract" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.888867 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="extract" Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.888964 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon-log" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.889038 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon-log" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.889802 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="extract" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.889912 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon-log" Feb 19 23:07:10 
crc kubenswrapper[4795]: I0219 23:07:10.890006 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.912017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.916301 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-4787d" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.918609 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.918986 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.920904 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.030477 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.031906 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.037191 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.037650 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-tfsnj" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.048736 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.050183 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.081299 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.099324 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.104955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87vck\" (UniqueName: \"kubernetes.io/projected/0a29e309-2974-42a7-afd9-c77d17f414d0-kube-api-access-87vck\") pod \"obo-prometheus-operator-68bc856cb9-s52kw\" (UID: \"0a29e309-2974-42a7-afd9-c77d17f414d0\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.186424 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ls2vk"] Feb 
19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.188668 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.192361 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.192645 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-48rz4" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.205821 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ls2vk"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87vck\" (UniqueName: \"kubernetes.io/projected/0a29e309-2974-42a7-afd9-c77d17f414d0-kube-api-access-87vck\") pod \"obo-prometheus-operator-68bc856cb9-s52kw\" (UID: \"0a29e309-2974-42a7-afd9-c77d17f414d0\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.232069 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87vck\" (UniqueName: \"kubernetes.io/projected/0a29e309-2974-42a7-afd9-c77d17f414d0-kube-api-access-87vck\") pod \"obo-prometheus-operator-68bc856cb9-s52kw\" (UID: \"0a29e309-2974-42a7-afd9-c77d17f414d0\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.245564 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308346 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d03bcc6-aa94-401a-9a3b-4970f64537cd-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308605 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6vnc\" (UniqueName: \"kubernetes.io/projected/5d03bcc6-aa94-401a-9a3b-4970f64537cd-kube-api-access-w6vnc\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.311804 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-w6sln"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.315312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.316643 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.317638 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.320320 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.329152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-w6sln"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.331795 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.336968 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-qztjr" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.349943 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.375306 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.413981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6vnc\" (UniqueName: \"kubernetes.io/projected/5d03bcc6-aa94-401a-9a3b-4970f64537cd-kube-api-access-w6vnc\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.414388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d03bcc6-aa94-401a-9a3b-4970f64537cd-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.423374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d03bcc6-aa94-401a-9a3b-4970f64537cd-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.456962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6vnc\" (UniqueName: \"kubernetes.io/projected/5d03bcc6-aa94-401a-9a3b-4970f64537cd-kube-api-access-w6vnc\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.505691 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.518784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnv7x\" (UniqueName: \"kubernetes.io/projected/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-kube-api-access-nnv7x\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.518968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.573472 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda8a248-0107-4d34-a02b-6dbf30972c64" path="/var/lib/kubelet/pods/eda8a248-0107-4d34-a02b-6dbf30972c64/volumes" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.574946 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc96fc8-80e4-4dda-af2e-91390b6af829" path="/var/lib/kubelet/pods/fcc96fc8-80e4-4dda-af2e-91390b6af829/volumes" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.642821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnv7x\" (UniqueName: \"kubernetes.io/projected/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-kube-api-access-nnv7x\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.643256 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.644093 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.673977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnv7x\" (UniqueName: \"kubernetes.io/projected/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-kube-api-access-nnv7x\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.801600 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.042506 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw"] Feb 19 23:07:12 crc kubenswrapper[4795]: W0219 23:07:12.058885 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a29e309_2974_42a7_afd9_c77d17f414d0.slice/crio-8a5309c92897f41bee68a6bab50234526b7efd5a8bc86ab390db5fb079f34e38 WatchSource:0}: Error finding container 8a5309c92897f41bee68a6bab50234526b7efd5a8bc86ab390db5fb079f34e38: Status 404 returned error can't find the container with id 8a5309c92897f41bee68a6bab50234526b7efd5a8bc86ab390db5fb079f34e38 Feb 19 23:07:12 crc kubenswrapper[4795]: W0219 23:07:12.180391 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658abf91_1e8b_4182_998f_76d3ed17b836.slice/crio-13b9004e1c9a968a77ed9603a1c5f1f794c4e77bb4ee543a655ae6d8441de5cf WatchSource:0}: Error finding container 13b9004e1c9a968a77ed9603a1c5f1f794c4e77bb4ee543a655ae6d8441de5cf: Status 404 returned error can't find the container with id 13b9004e1c9a968a77ed9603a1c5f1f794c4e77bb4ee543a655ae6d8441de5cf Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.189106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb"] Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.202565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd"] Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.347703 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ls2vk"] Feb 19 23:07:12 crc kubenswrapper[4795]: W0219 
23:07:12.353738 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d03bcc6_aa94_401a_9a3b_4970f64537cd.slice/crio-61354bba44c68b07c36a9ad14176942036125815a1d4e68bdcc378739045a92c WatchSource:0}: Error finding container 61354bba44c68b07c36a9ad14176942036125815a1d4e68bdcc378739045a92c: Status 404 returned error can't find the container with id 61354bba44c68b07c36a9ad14176942036125815a1d4e68bdcc378739045a92c
Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.460323 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-w6sln"]
Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.995007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" event={"ID":"0a29e309-2974-42a7-afd9-c77d17f414d0","Type":"ContainerStarted","Data":"8a5309c92897f41bee68a6bab50234526b7efd5a8bc86ab390db5fb079f34e38"}
Feb 19 23:07:13 crc kubenswrapper[4795]: I0219 23:07:13.012040 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" event={"ID":"658abf91-1e8b-4182-998f-76d3ed17b836","Type":"ContainerStarted","Data":"13b9004e1c9a968a77ed9603a1c5f1f794c4e77bb4ee543a655ae6d8441de5cf"}
Feb 19 23:07:13 crc kubenswrapper[4795]: I0219 23:07:13.013298 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" event={"ID":"fb0e3807-a209-43ca-a245-64283a1d021f","Type":"ContainerStarted","Data":"defa1ed0cd56e005c852167667d4d7f1341390b5bd35c25ff6a70d1ae1f69160"}
Feb 19 23:07:13 crc kubenswrapper[4795]: I0219 23:07:13.015486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" event={"ID":"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b","Type":"ContainerStarted","Data":"a4a35f9a7530a173b815a5a8fbaf856d5a542972a107e0bf7eb3da7d799df193"}
Feb 19 23:07:13 crc kubenswrapper[4795]: I0219 23:07:13.016554 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" event={"ID":"5d03bcc6-aa94-401a-9a3b-4970f64537cd","Type":"ContainerStarted","Data":"61354bba44c68b07c36a9ad14176942036125815a1d4e68bdcc378739045a92c"}
Feb 19 23:07:14 crc kubenswrapper[4795]: I0219 23:07:14.512586 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"
Feb 19 23:07:14 crc kubenswrapper[4795]: E0219 23:07:14.513152 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:07:17 crc kubenswrapper[4795]: I0219 23:07:17.042482 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-srnhx"]
Feb 19 23:07:17 crc kubenswrapper[4795]: I0219 23:07:17.050739 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-srnhx"]
Feb 19 23:07:17 crc kubenswrapper[4795]: I0219 23:07:17.355387 4795 scope.go:117] "RemoveContainer" containerID="a92e1a832062a3ff2c20dc8cd15ae2f139e651b81902a3e2ee439ec6e20b2123"
Feb 19 23:07:17 crc kubenswrapper[4795]: I0219 23:07:17.718701 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7286d7ba-7f8c-4f40-a18a-d29af788c344" path="/var/lib/kubelet/pods/7286d7ba-7f8c-4f40-a18a-d29af788c344/volumes"
Feb 19 23:07:24 crc kubenswrapper[4795]: I0219 23:07:24.784658 4795 scope.go:117] "RemoveContainer" containerID="b7a9bfd4cc9f1de56acd0e84791ea820651eb87fe9629808f92a9d7ea31fba08"
Feb 19 23:07:24 crc kubenswrapper[4795]: I0219 23:07:24.850400 4795 scope.go:117] "RemoveContainer" containerID="ce12984ea586896da4a3a2ca9a12c46a23d3b89ee0886c7e9ee2b6ceb73add38"
Feb 19 23:07:24 crc kubenswrapper[4795]: I0219 23:07:24.965942 4795 scope.go:117] "RemoveContainer" containerID="2fce75ec54d1aa6599ee05902a3278bc712bcedfa4ab8adb5ae0a48b2003dc71"
Feb 19 23:07:25 crc kubenswrapper[4795]: I0219 23:07:25.073800 4795 scope.go:117] "RemoveContainer" containerID="406b01e5656ffc8e932bd5ff0af64ce799789e55eeb816cb84ffd89f44da61ef"
Feb 19 23:07:25 crc kubenswrapper[4795]: I0219 23:07:25.173581 4795 scope.go:117] "RemoveContainer" containerID="dbe34d14d5acf3b90d2e3b7465f76c19cc181a6a6717aade5162b1682beb16f4"
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.207575 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" event={"ID":"fb0e3807-a209-43ca-a245-64283a1d021f","Type":"ContainerStarted","Data":"682b8f555a3cbdac5c847614e8e8f922fae5c657c652b1d0d67c7b3900ca001c"}
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.208996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" event={"ID":"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b","Type":"ContainerStarted","Data":"22e157950a34b88b6beab18352898d9a491eafe4a19446a758769a2c448fa1a3"}
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.209064 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-w6sln"
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.210074 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" event={"ID":"0a29e309-2974-42a7-afd9-c77d17f414d0","Type":"ContainerStarted","Data":"cc7d6b7a70dbc038ef0a36ea561b3ce2b71efb27a24befaf68ed450c51a65c2f"}
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.211426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" event={"ID":"658abf91-1e8b-4182-998f-76d3ed17b836","Type":"ContainerStarted","Data":"54ad2b3df4352c8c6d60ccf5835f2c0cb6d61d704d51733ce8f5dbbeca69fa26"}
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.212546 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" event={"ID":"5d03bcc6-aa94-401a-9a3b-4970f64537cd","Type":"ContainerStarted","Data":"5161f4538ca8f54a75b3d76362bad3b3b48c240663435771edeff5a8bdf806ad"}
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.212895 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk"
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.220764 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk"
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.236761 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" podStartSLOduration=2.57395792 podStartE2EDuration="15.236740175s" podCreationTimestamp="2026-02-19 23:07:11 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.188810146 +0000 UTC m=+5943.381328010" lastFinishedPulling="2026-02-19 23:07:24.851592401 +0000 UTC m=+5956.044110265" observedRunningTime="2026-02-19 23:07:26.222009483 +0000 UTC m=+5957.414527347" watchObservedRunningTime="2026-02-19 23:07:26.236740175 +0000 UTC m=+5957.429258039"
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.250252 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" podStartSLOduration=2.512897112 podStartE2EDuration="15.250235565s" podCreationTimestamp="2026-02-19 23:07:11 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.186927535 +0000 UTC m=+5943.379445399" lastFinishedPulling="2026-02-19 23:07:24.924265998 +0000 UTC m=+5956.116783852" observedRunningTime="2026-02-19 23:07:26.244208044 +0000 UTC m=+5957.436725908" watchObservedRunningTime="2026-02-19 23:07:26.250235565 +0000 UTC m=+5957.442753429"
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.314939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" podStartSLOduration=2.92790728 podStartE2EDuration="15.314919778s" podCreationTimestamp="2026-02-19 23:07:11 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.463560896 +0000 UTC m=+5943.656078760" lastFinishedPulling="2026-02-19 23:07:24.850573394 +0000 UTC m=+5956.043091258" observedRunningTime="2026-02-19 23:07:26.301686566 +0000 UTC m=+5957.494204430" watchObservedRunningTime="2026-02-19 23:07:26.314919778 +0000 UTC m=+5957.507437642"
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.374183 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" podStartSLOduration=3.561444336 podStartE2EDuration="16.374153676s" podCreationTimestamp="2026-02-19 23:07:10 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.061670859 +0000 UTC m=+5943.254188723" lastFinishedPulling="2026-02-19 23:07:24.874380209 +0000 UTC m=+5956.066898063" observedRunningTime="2026-02-19 23:07:26.342479893 +0000 UTC m=+5957.534997757" watchObservedRunningTime="2026-02-19 23:07:26.374153676 +0000 UTC m=+5957.566671540"
Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.380627 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" podStartSLOduration=2.770883836 podStartE2EDuration="15.380614999s" podCreationTimestamp="2026-02-19 23:07:11 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.355187098 +0000 UTC m=+5943.547704962" lastFinishedPulling="2026-02-19 23:07:24.964918261 +0000 UTC m=+5956.157436125" observedRunningTime="2026-02-19 23:07:26.372715748 +0000 UTC m=+5957.565233612" watchObservedRunningTime="2026-02-19 23:07:26.380614999 +0000 UTC m=+5957.573132853"
Feb 19 23:07:28 crc kubenswrapper[4795]: I0219 23:07:28.511536 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"
Feb 19 23:07:29 crc kubenswrapper[4795]: I0219 23:07:29.239842 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d"}
Feb 19 23:07:31 crc kubenswrapper[4795]: I0219 23:07:31.805005 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-w6sln"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.624265 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.625903 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerName="openstackclient" containerID="cri-o://3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" gracePeriod=2
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.636985 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.703624 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 19 23:07:34 crc kubenswrapper[4795]: E0219 23:07:34.704346 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerName="openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.704363 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerName="openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.704567 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerName="openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.705226 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.710380 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" podUID="f1d06b1e-9114-47b8-913d-86144f6314c3"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.737262 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.854204 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.856319 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.865598 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-l7j8w"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.866353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config-secret\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.866431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b94s\" (UniqueName: \"kubernetes.io/projected/f1d06b1e-9114-47b8-913d-86144f6314c3-kube-api-access-4b94s\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.866469 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.875713 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.969419 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config-secret\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.969473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b94s\" (UniqueName: \"kubernetes.io/projected/f1d06b1e-9114-47b8-913d-86144f6314c3-kube-api-access-4b94s\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.969500 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.969541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqtks\" (UniqueName: \"kubernetes.io/projected/5a2a47de-c40d-40c9-8556-ea7033a4033b-kube-api-access-zqtks\") pod \"kube-state-metrics-0\" (UID: \"5a2a47de-c40d-40c9-8556-ea7033a4033b\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.974111 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient"
Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.978372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config-secret\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.035472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b94s\" (UniqueName: \"kubernetes.io/projected/f1d06b1e-9114-47b8-913d-86144f6314c3-kube-api-access-4b94s\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.071733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqtks\" (UniqueName: \"kubernetes.io/projected/5a2a47de-c40d-40c9-8556-ea7033a4033b-kube-api-access-zqtks\") pod \"kube-state-metrics-0\" (UID: \"5a2a47de-c40d-40c9-8556-ea7033a4033b\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.115961 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqtks\" (UniqueName: \"kubernetes.io/projected/5a2a47de-c40d-40c9-8556-ea7033a4033b-kube-api-access-zqtks\") pod \"kube-state-metrics-0\" (UID: \"5a2a47de-c40d-40c9-8556-ea7033a4033b\") " pod="openstack/kube-state-metrics-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.199726 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.329488 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.641640 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.667621 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.667713 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688688 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688937 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688686 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-dlgzw"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688695 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808500 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808634 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808746 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mznjs\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-kube-api-access-mznjs\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910742 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mznjs\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-kube-api-access-mznjs\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910785 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910918 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.912254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.917259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.920544 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.930820 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.933049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.935507 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mznjs\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-kube-api-access-mznjs\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.949071 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.021664 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.075600 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.277395 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.279601 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.303352 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.303884 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.303484 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.303553 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.310075 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.310190 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.315943 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.338687 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-n4gjc"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.387738 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.387961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a2a47de-c40d-40c9-8556-ea7033a4033b","Type":"ContainerStarted","Data":"28e7859be678b09a2f3d8d670d0b4e0ab24a6551e502e177072a9b339370025d"}
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.408990 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489554 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489638 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/281b5fc0-7da4-4d5a-89d4-b073b1500865-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489753 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489995 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zsf5\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-kube-api-access-4zsf5\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.605829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606523 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zsf5\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-kube-api-access-4zsf5\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606748 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606839 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/281b5fc0-7da4-4d5a-89d4-b073b1500865-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606980 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.610025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.614459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0"
Feb 19 23:07:36 crc kubenswrapper[4795]: I0219
23:07:36.617854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.622431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.622741 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.632882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.634213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.635782 4795 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.635833 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0e98ca8a4306f45b0c207e0b22ce20c78efdfec2f28e0669d30e68d17890be1f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.650334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/281b5fc0-7da4-4d5a-89d4-b073b1500865-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.651977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zsf5\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-kube-api-access-4zsf5\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.686304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.961973 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.991371 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.334593 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.400705 4795 generic.go:334] "Generic (PLEG): container finished" podID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerID="3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" exitCode=137 Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.400783 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.400816 4795 scope.go:117] "RemoveContainer" containerID="3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.402940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerStarted","Data":"91039ea6f3bcf8ab62a299a40c821405c3ef84455f4d7cafb422b0ca09dbe4d0"} Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.407961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f1d06b1e-9114-47b8-913d-86144f6314c3","Type":"ContainerStarted","Data":"5f9b3bfde2c60041dc2613027af3cb7a93b83c6359ae775ac8a5b59a4fbe841d"} Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.424254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") pod \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\" (UID: 
\"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.424619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") pod \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.424847 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sfjd\" (UniqueName: \"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") pod \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.436338 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd" (OuterVolumeSpecName: "kube-api-access-8sfjd") pod "54e90f84-703c-41b3-85c2-dd4ce9e3a968" (UID: "54e90f84-703c-41b3-85c2-dd4ce9e3a968"). InnerVolumeSpecName "kube-api-access-8sfjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.477641 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "54e90f84-703c-41b3-85c2-dd4ce9e3a968" (UID: "54e90f84-703c-41b3-85c2-dd4ce9e3a968"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.494715 4795 scope.go:117] "RemoveContainer" containerID="3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" Feb 19 23:07:37 crc kubenswrapper[4795]: E0219 23:07:37.495699 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd\": container with ID starting with 3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd not found: ID does not exist" containerID="3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.495735 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd"} err="failed to get container status \"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd\": rpc error: code = NotFound desc = could not find container \"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd\": container with ID starting with 3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd not found: ID does not exist" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.518817 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "54e90f84-703c-41b3-85c2-dd4ce9e3a968" (UID: "54e90f84-703c-41b3-85c2-dd4ce9e3a968"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.524780 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" path="/var/lib/kubelet/pods/54e90f84-703c-41b3-85c2-dd4ce9e3a968/volumes" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.530568 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.530602 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.530613 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sfjd\" (UniqueName: \"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.617883 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.421262 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a2a47de-c40d-40c9-8556-ea7033a4033b","Type":"ContainerStarted","Data":"1d898d6c38d43a61747bc511598e0c38b71f457124d422748f7fc0a5fd54852b"} Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.421448 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.423122 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"04c67ed8f0cf7f1b0e556d1df003bd0033d604fd36cf7640910b6dc261b03a92"} Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.425335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f1d06b1e-9114-47b8-913d-86144f6314c3","Type":"ContainerStarted","Data":"54c3c15f47eb5e449e2d5319739718bd7136c1784750f7f835b55b200ef136a6"} Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.443661 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.80503997 podStartE2EDuration="4.443642644s" podCreationTimestamp="2026-02-19 23:07:34 +0000 UTC" firstStartedPulling="2026-02-19 23:07:36.130818326 +0000 UTC m=+5967.323336190" lastFinishedPulling="2026-02-19 23:07:36.769421 +0000 UTC m=+5967.961938864" observedRunningTime="2026-02-19 23:07:38.43897002 +0000 UTC m=+5969.631487884" watchObservedRunningTime="2026-02-19 23:07:38.443642644 +0000 UTC m=+5969.636160508" Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.461425 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.4614032869999996 podStartE2EDuration="4.461403287s" podCreationTimestamp="2026-02-19 23:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:07:38.455082989 +0000 UTC m=+5969.647600853" watchObservedRunningTime="2026-02-19 23:07:38.461403287 +0000 UTC m=+5969.653921141" Feb 19 23:07:42 crc kubenswrapper[4795]: I0219 23:07:42.468774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"95b763a5281f060aa043098421142c64a5a0e1f503f589832c897f8922788c08"} Feb 19 23:07:44 crc kubenswrapper[4795]: I0219 
23:07:44.489989 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerStarted","Data":"33132c914d6072c08b89eaca99297423ffe42a3393bab4b9400028bc91539b90"} Feb 19 23:07:45 crc kubenswrapper[4795]: I0219 23:07:45.208304 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 23:07:50 crc kubenswrapper[4795]: I0219 23:07:50.560105 4795 generic.go:334] "Generic (PLEG): container finished" podID="b5b60b6d-7ecf-424d-a297-f98fae5ef0a3" containerID="33132c914d6072c08b89eaca99297423ffe42a3393bab4b9400028bc91539b90" exitCode=0 Feb 19 23:07:50 crc kubenswrapper[4795]: I0219 23:07:50.560222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerDied","Data":"33132c914d6072c08b89eaca99297423ffe42a3393bab4b9400028bc91539b90"} Feb 19 23:07:50 crc kubenswrapper[4795]: I0219 23:07:50.564383 4795 generic.go:334] "Generic (PLEG): container finished" podID="281b5fc0-7da4-4d5a-89d4-b073b1500865" containerID="95b763a5281f060aa043098421142c64a5a0e1f503f589832c897f8922788c08" exitCode=0 Feb 19 23:07:50 crc kubenswrapper[4795]: I0219 23:07:50.564475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerDied","Data":"95b763a5281f060aa043098421142c64a5a0e1f503f589832c897f8922788c08"} Feb 19 23:07:53 crc kubenswrapper[4795]: I0219 23:07:53.601053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerStarted","Data":"035dced6dc1d8c609f3f4f83f0e988bba429ccd5ba721ef5396e1f2e114bdb62"} Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.627079 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"460ba294f22d0be78bd36903159370630e46c54d8b61d72126070926df32cbbf"} Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.630750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerStarted","Data":"64ab6545c5dbcb6ef2aa8c68dc95e4d32067b0113778a132e51b79ca3dde9d3e"} Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.630983 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.634696 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.654463 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.550604469 podStartE2EDuration="21.654444062s" podCreationTimestamp="2026-02-19 23:07:35 +0000 UTC" firstStartedPulling="2026-02-19 23:07:36.994818085 +0000 UTC m=+5968.187335949" lastFinishedPulling="2026-02-19 23:07:53.098657678 +0000 UTC m=+5984.291175542" observedRunningTime="2026-02-19 23:07:56.64984637 +0000 UTC m=+5987.842364234" watchObservedRunningTime="2026-02-19 23:07:56.654444062 +0000 UTC m=+5987.846961926" Feb 19 23:07:59 crc kubenswrapper[4795]: I0219 23:07:59.662401 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"c11350d4307defa8241541c8c4e814d6c6de396113561f904b3198d0aad466ed"} Feb 19 23:08:04 crc kubenswrapper[4795]: I0219 23:08:04.721122 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"a464a72fbdb723c6703282b6c1410035b218194056585f1670956cbf17f81b82"} Feb 19 23:08:04 crc kubenswrapper[4795]: I0219 23:08:04.756486 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.517215225 podStartE2EDuration="29.756461229s" podCreationTimestamp="2026-02-19 23:07:35 +0000 UTC" firstStartedPulling="2026-02-19 23:07:37.626192085 +0000 UTC m=+5968.818709949" lastFinishedPulling="2026-02-19 23:08:03.865438089 +0000 UTC m=+5995.057955953" observedRunningTime="2026-02-19 23:08:04.746239696 +0000 UTC m=+5995.938757550" watchObservedRunningTime="2026-02-19 23:08:04.756461229 +0000 UTC m=+5995.948979113" Feb 19 23:08:06 crc kubenswrapper[4795]: I0219 23:08:06.991808 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 23:08:06 crc kubenswrapper[4795]: I0219 23:08:06.992393 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 23:08:06 crc kubenswrapper[4795]: I0219 23:08:06.996285 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 23:08:07 crc kubenswrapper[4795]: I0219 23:08:07.750436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.743910 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.748346 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.751172 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.751180 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.767134 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785263 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " 
pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785707 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887214 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887230 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: 
I0219 23:08:08.889025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0"
Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.892920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0"
Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.893020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0"
Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.894397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0"
Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.909822 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0"
Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.917838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0"
Feb 19 23:08:09 crc kubenswrapper[4795]: I0219 23:08:09.068178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 23:08:09 crc kubenswrapper[4795]: I0219 23:08:09.567337 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 23:08:09 crc kubenswrapper[4795]: I0219 23:08:09.774375 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"92493a20710869063a50df759376e4f92368ef4022d75d392f1e23b18e170ed8"}
Feb 19 23:08:10 crc kubenswrapper[4795]: I0219 23:08:10.785915 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213"}
Feb 19 23:08:11 crc kubenswrapper[4795]: I0219 23:08:11.799830 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa"}
Feb 19 23:08:12 crc kubenswrapper[4795]: I0219 23:08:12.811510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31"}
Feb 19 23:08:14 crc kubenswrapper[4795]: I0219 23:08:14.837451 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305"}
Feb 19 23:08:14 crc kubenswrapper[4795]: I0219 23:08:14.838390 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 23:08:14 crc kubenswrapper[4795]: I0219 23:08:14.868663 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.262240424 podStartE2EDuration="6.86863768s" podCreationTimestamp="2026-02-19 23:08:08 +0000 UTC" firstStartedPulling="2026-02-19 23:08:09.578188649 +0000 UTC m=+6000.770706513" lastFinishedPulling="2026-02-19 23:08:14.184585905 +0000 UTC m=+6005.377103769" observedRunningTime="2026-02-19 23:08:14.858357826 +0000 UTC m=+6006.050875690" watchObservedRunningTime="2026-02-19 23:08:14.86863768 +0000 UTC m=+6006.061155554"
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.047681 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-qp45n"]
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.056670 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"]
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.065779 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-x75sd"]
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.074982 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"]
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.084236 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wj996"]
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.093268 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-qp45n"]
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.102208 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-x75sd"]
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.110869 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wj996"]
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.538510 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41276d39-878a-4ed2-879b-2a053340874e" path="/var/lib/kubelet/pods/41276d39-878a-4ed2-879b-2a053340874e/volumes"
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.539926 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" path="/var/lib/kubelet/pods/6f811fa4-8fb3-4adc-a9a8-6539dc03494c/volumes"
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.541581 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4e5010-15f4-499e-8279-9a1b814b5490" path="/var/lib/kubelet/pods/bd4e5010-15f4-499e-8279-9a1b814b5490/volumes"
Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.542885 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14f4993-80e4-4fbf-a719-22f17750811b" path="/var/lib/kubelet/pods/c14f4993-80e4-4fbf-a719-22f17750811b/volumes"
Feb 19 23:08:18 crc kubenswrapper[4795]: I0219 23:08:18.035071 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"]
Feb 19 23:08:18 crc kubenswrapper[4795]: I0219 23:08:18.048131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"]
Feb 19 23:08:18 crc kubenswrapper[4795]: I0219 23:08:18.062338 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"]
Feb 19 23:08:18 crc kubenswrapper[4795]: I0219 23:08:18.077201 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"]
Feb 19 23:08:19 crc kubenswrapper[4795]: I0219 23:08:19.528082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" path="/var/lib/kubelet/pods/32478c4a-a97f-4fd3-84f0-a3c221beefe9/volumes"
Feb 19 23:08:19 crc kubenswrapper[4795]: I0219 23:08:19.528959 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f38e11-ea05-447d-8564-117c0f589d88" path="/var/lib/kubelet/pods/b6f38e11-ea05-447d-8564-117c0f589d88/volumes"
Feb 19 23:08:20 crc kubenswrapper[4795]: I0219 23:08:20.985958 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-jll2l"]
Feb 19 23:08:20 crc kubenswrapper[4795]: I0219 23:08:20.989318 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:20 crc kubenswrapper[4795]: I0219 23:08:20.994338 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jll2l"]
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.101972 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"]
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.103682 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.106667 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.119069 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"]
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.133947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.134128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.239374 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.239548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.239594 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.239656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.240137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.265440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.341347 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.341428 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.342451 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.356891 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.364210 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.422122 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:21 crc kubenswrapper[4795]: W0219 23:08:21.904845 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3ad8082_2f7c_4c51_ac6d_f6121f30d0c8.slice/crio-ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713 WatchSource:0}: Error finding container ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713: Status 404 returned error can't find the container with id ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713
Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.906786 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jll2l"]
Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.017298 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"]
Feb 19 23:08:22 crc kubenswrapper[4795]: W0219 23:08:22.018027 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3086733_54e4_4041_9896_88f6df519492.slice/crio-1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59 WatchSource:0}: Error finding container 1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59: Status 404 returned error can't find the container with id 1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59
Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.929839 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3086733-54e4-4041-9896-88f6df519492" containerID="49e34c186890a85782e2cf1f05ffa268eb8918522484c1ad1e090793c43863fa" exitCode=0
Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.929949 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-aa86-account-create-update-n8xdq" event={"ID":"d3086733-54e4-4041-9896-88f6df519492","Type":"ContainerDied","Data":"49e34c186890a85782e2cf1f05ffa268eb8918522484c1ad1e090793c43863fa"}
Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.930211 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-aa86-account-create-update-n8xdq" event={"ID":"d3086733-54e4-4041-9896-88f6df519492","Type":"ContainerStarted","Data":"1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59"}
Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.931777 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" containerID="6653ee424e57053f6b0308afd986889a06d2949741f5136682c4b14068b5b724" exitCode=0
Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.931837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jll2l" event={"ID":"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8","Type":"ContainerDied","Data":"6653ee424e57053f6b0308afd986889a06d2949741f5136682c4b14068b5b724"}
Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.931864 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jll2l" event={"ID":"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8","Type":"ContainerStarted","Data":"ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713"}
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.353513 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.360039 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.408857 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") pod \"d3086733-54e4-4041-9896-88f6df519492\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") "
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.409018 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") pod \"d3086733-54e4-4041-9896-88f6df519492\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") "
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.409435 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3086733-54e4-4041-9896-88f6df519492" (UID: "d3086733-54e4-4041-9896-88f6df519492"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.409908 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.417395 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5" (OuterVolumeSpecName: "kube-api-access-6mpb5") pod "d3086733-54e4-4041-9896-88f6df519492" (UID: "d3086733-54e4-4041-9896-88f6df519492"). InnerVolumeSpecName "kube-api-access-6mpb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.511750 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") pod \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") "
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.511893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") pod \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") "
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.512549 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" (UID: "d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.512816 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.512835 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.515051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w" (OuterVolumeSpecName: "kube-api-access-z8p4w") pod "d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" (UID: "d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8"). InnerVolumeSpecName "kube-api-access-z8p4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.614501 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.958787 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-aa86-account-create-update-n8xdq" event={"ID":"d3086733-54e4-4041-9896-88f6df519492","Type":"ContainerDied","Data":"1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59"}
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.958826 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59"
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.959122 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-aa86-account-create-update-n8xdq"
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.960248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jll2l" event={"ID":"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8","Type":"ContainerDied","Data":"ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713"}
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.960268 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713"
Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.960395 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jll2l"
Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.440728 4795 scope.go:117] "RemoveContainer" containerID="2ef66b4d7c2d7c4390b233ae4586dc431786dee8c34b27c0308b9e8e392e9643"
Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.466836 4795 scope.go:117] "RemoveContainer" containerID="4ee5bc336806db9668bb12f08ece22f6cb10a6fe341f3f2abd0e9309ef59d764"
Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.530099 4795 scope.go:117] "RemoveContainer" containerID="886bef23bb00ea44b033b4812e88d91f6efe4ec5933a69343ea4b9bc4fed7503"
Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.579263 4795 scope.go:117] "RemoveContainer" containerID="00e59b7398cdff96aaf8a625c69d45b06909964fc853d0d0c0e9c6a12321c27b"
Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.618825 4795 scope.go:117] "RemoveContainer" containerID="aa245a9fc5b7b00ccfbfeb6940d75331800d0a653cedd61f24d6b1162fb6f41d"
Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.666045 4795 scope.go:117] "RemoveContainer" containerID="0a09a6dbfb16ba30386d5f0a4f52128dc79107a79561ffd233f9f60e4e73100d"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.590738 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-q2lkk"]
Feb 19 23:08:26 crc kubenswrapper[4795]: E0219 23:08:26.591664 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" containerName="mariadb-database-create"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.591680 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" containerName="mariadb-database-create"
Feb 19 23:08:26 crc kubenswrapper[4795]: E0219 23:08:26.591701 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3086733-54e4-4041-9896-88f6df519492" containerName="mariadb-account-create-update"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.591709 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3086733-54e4-4041-9896-88f6df519492" containerName="mariadb-account-create-update"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.591946 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3086733-54e4-4041-9896-88f6df519492" containerName="mariadb-account-create-update"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.591978 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" containerName="mariadb-database-create"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.593071 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.595091 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.595348 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.595547 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.595863 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7c4hw"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.603501 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-q2lkk"]
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.683718 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.683802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.683835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.683971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.786056 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.786193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.786277 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.786317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.792658 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.792985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.793282 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.803111 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.926810 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.047949 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"]
Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.059948 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"]
Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.400881 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-q2lkk"]
Feb 19 23:08:27 crc kubenswrapper[4795]: W0219 23:08:27.403205 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1953fbb_b558_497f_b889_62b41f35e4b4.slice/crio-dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b WatchSource:0}: Error finding container dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b: Status 404 returned error can't find the container with id dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b
Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.523452 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" path="/var/lib/kubelet/pods/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5/volumes"
Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.997850 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q2lkk" event={"ID":"e1953fbb-b558-497f-b889-62b41f35e4b4","Type":"ContainerStarted","Data":"dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b"}
Feb 19 23:08:31 crc kubenswrapper[4795]: I0219 23:08:31.039235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q2lkk" event={"ID":"e1953fbb-b558-497f-b889-62b41f35e4b4","Type":"ContainerStarted","Data":"47bda409c5d13cad14cf54811039c8544c7b3358bb1de45385c9a60e61a75704"}
Feb 19 23:08:31 crc kubenswrapper[4795]: I0219 23:08:31.055731 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-q2lkk" podStartSLOduration=1.680619766 podStartE2EDuration="5.055715838s" podCreationTimestamp="2026-02-19 23:08:26 +0000 UTC" firstStartedPulling="2026-02-19 23:08:27.405362644 +0000 UTC m=+6018.597880508" lastFinishedPulling="2026-02-19 23:08:30.780458706 +0000 UTC m=+6021.972976580" observedRunningTime="2026-02-19 23:08:31.051622609 +0000 UTC m=+6022.244140463" watchObservedRunningTime="2026-02-19 23:08:31.055715838 +0000 UTC m=+6022.248233702"
Feb 19 23:08:33 crc kubenswrapper[4795]: I0219 23:08:33.061216 4795 generic.go:334] "Generic (PLEG): container finished" podID="e1953fbb-b558-497f-b889-62b41f35e4b4" containerID="47bda409c5d13cad14cf54811039c8544c7b3358bb1de45385c9a60e61a75704" exitCode=0
Feb 19 23:08:33 crc kubenswrapper[4795]: I0219 23:08:33.061326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q2lkk" event={"ID":"e1953fbb-b558-497f-b889-62b41f35e4b4","Type":"ContainerDied","Data":"47bda409c5d13cad14cf54811039c8544c7b3358bb1de45385c9a60e61a75704"}
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.485151 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-q2lkk"
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.554974 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") pod \"e1953fbb-b558-497f-b889-62b41f35e4b4\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") "
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.555754 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") pod \"e1953fbb-b558-497f-b889-62b41f35e4b4\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") "
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.560258 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") pod \"e1953fbb-b558-497f-b889-62b41f35e4b4\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") "
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.560366 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") pod \"e1953fbb-b558-497f-b889-62b41f35e4b4\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") "
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.605372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2" (OuterVolumeSpecName: "kube-api-access-t4rc2") pod "e1953fbb-b558-497f-b889-62b41f35e4b4" (UID: "e1953fbb-b558-497f-b889-62b41f35e4b4"). InnerVolumeSpecName "kube-api-access-t4rc2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.629372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts" (OuterVolumeSpecName: "scripts") pod "e1953fbb-b558-497f-b889-62b41f35e4b4" (UID: "e1953fbb-b558-497f-b889-62b41f35e4b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.669495 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.669522 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.682316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data" (OuterVolumeSpecName: "config-data") pod "e1953fbb-b558-497f-b889-62b41f35e4b4" (UID: "e1953fbb-b558-497f-b889-62b41f35e4b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.716341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1953fbb-b558-497f-b889-62b41f35e4b4" (UID: "e1953fbb-b558-497f-b889-62b41f35e4b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.770654 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.770876 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:35 crc kubenswrapper[4795]: I0219 23:08:35.079523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q2lkk" event={"ID":"e1953fbb-b558-497f-b889-62b41f35e4b4","Type":"ContainerDied","Data":"dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b"}
Feb 19 23:08:35 crc kubenswrapper[4795]: I0219 23:08:35.079563 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b"
Feb 19 23:08:35 crc kubenswrapper[4795]: I0219 23:08:35.079618 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.040072 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 19 23:08:36 crc kubenswrapper[4795]: E0219 23:08:36.041188 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1953fbb-b558-497f-b889-62b41f35e4b4" containerName="aodh-db-sync" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.041211 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1953fbb-b558-497f-b889-62b41f35e4b4" containerName="aodh-db-sync" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.041628 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1953fbb-b558-497f-b889-62b41f35e4b4" containerName="aodh-db-sync" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.045204 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.047333 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.047906 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.052615 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.055717 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7c4hw" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.096492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc 
kubenswrapper[4795]: I0219 23:08:36.096695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-config-data\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.096726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4mnq\" (UniqueName: \"kubernetes.io/projected/acb98719-7401-4241-8361-070eb67879c7-kube-api-access-p4mnq\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.096845 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-scripts\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.198572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-scripts\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.198670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.198758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-config-data\") pod 
\"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.198775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mnq\" (UniqueName: \"kubernetes.io/projected/acb98719-7401-4241-8361-070eb67879c7-kube-api-access-p4mnq\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.213220 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-config-data\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.213257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.214344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-scripts\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.228010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mnq\" (UniqueName: \"kubernetes.io/projected/acb98719-7401-4241-8361-070eb67879c7-kube-api-access-p4mnq\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.364944 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.882062 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 19 23:08:37 crc kubenswrapper[4795]: I0219 23:08:37.100621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"b252e6ed8510a7cc2263a5cc42510d0a516ac8a803923d4988abacc34acf10da"} Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.119887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"7e8468251057892f31f06e7a69a5f13721746fb21c7868cf95003c9fe84705bf"} Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.171996 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.172371 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-central-agent" containerID="cri-o://34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213" gracePeriod=30 Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.172569 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="sg-core" containerID="cri-o://cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31" gracePeriod=30 Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.172743 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" containerID="cri-o://887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305" gracePeriod=30 Feb 19 23:08:38 crc 
kubenswrapper[4795]: I0219 23:08:38.172808 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-notification-agent" containerID="cri-o://793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa" gracePeriod=30 Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.183968 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.133:3000/\": EOF" Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.069081 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.133:3000/\": dial tcp 10.217.1.133:3000: connect: connection refused" Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146587 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed07c74b-0313-43e7-a031-024966ef2734" containerID="887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305" exitCode=0 Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146927 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed07c74b-0313-43e7-a031-024966ef2734" containerID="cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31" exitCode=2 Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146935 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed07c74b-0313-43e7-a031-024966ef2734" containerID="34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213" exitCode=0 Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305"} Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146972 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31"} Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213"} Feb 19 23:08:40 crc kubenswrapper[4795]: I0219 23:08:40.168193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"170a213f21aa36634496b5342c45671959ff7f7b76474d1dc214499f846811df"} Feb 19 23:08:41 crc kubenswrapper[4795]: I0219 23:08:41.177750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"6788bc52af3b2c8a2628b250d81f0e5ca90417eb4d2797555b72ee77ae358ffb"} Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.190388 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed07c74b-0313-43e7-a031-024966ef2734" containerID="793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa" exitCode=0 Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.190479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa"} Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.673330 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.756776 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757066 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757100 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757202 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757258 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757897 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.763082 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts" (OuterVolumeSpecName: "scripts") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.765106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n" (OuterVolumeSpecName: "kube-api-access-hbt7n") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "kube-api-access-hbt7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.794208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.834598 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859415 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859456 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859471 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859486 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859499 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859510 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.861127 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data" (OuterVolumeSpecName: "config-data") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.962149 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.201438 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"92493a20710869063a50df759376e4f92368ef4022d75d392f1e23b18e170ed8"} Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.201479 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.201537 4795 scope.go:117] "RemoveContainer" containerID="887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.207190 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"64a671ad808d96934ccf0bf7b3291555f64cee0d7425b8a291a73bf4f07607d7"} Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.242016 4795 scope.go:117] "RemoveContainer" containerID="cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.262842 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.83291366 podStartE2EDuration="7.262815244s" podCreationTimestamp="2026-02-19 23:08:36 +0000 UTC" firstStartedPulling="2026-02-19 23:08:36.885680243 +0000 UTC m=+6028.078198117" lastFinishedPulling="2026-02-19 23:08:42.315581837 +0000 UTC m=+6033.508099701" observedRunningTime="2026-02-19 23:08:43.23300613 +0000 UTC m=+6034.425523994" 
watchObservedRunningTime="2026-02-19 23:08:43.262815244 +0000 UTC m=+6034.455333108" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.287922 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.318489 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.337031 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:43 crc kubenswrapper[4795]: E0219 23:08:43.337678 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.337704 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" Feb 19 23:08:43 crc kubenswrapper[4795]: E0219 23:08:43.337733 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-notification-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.337742 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-notification-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: E0219 23:08:43.337756 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-central-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.337767 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-central-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: E0219 23:08:43.337783 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="sg-core" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 
23:08:43.337791 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="sg-core" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.338078 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-central-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.338107 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-notification-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.338132 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="sg-core" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.338145 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.340713 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.343595 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.344021 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.361549 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " 
pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376767 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.377456 4795 scope.go:117] "RemoveContainer" containerID="793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.424487 4795 scope.go:117] "RemoveContainer" containerID="34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.478948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.479010 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.479051 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.479095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.480437 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.480478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.480515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 
23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.481286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.482379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.484249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.484310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.486319 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.487984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") pod \"ceilometer-0\" (UID: 
\"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.500052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.524668 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed07c74b-0313-43e7-a031-024966ef2734" path="/var/lib/kubelet/pods/ed07c74b-0313-43e7-a031-024966ef2734/volumes" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.670974 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:44 crc kubenswrapper[4795]: I0219 23:08:44.314315 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 23:08:45.048440 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"] Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 23:08:45.061485 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"] Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 23:08:45.244876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc"} Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 23:08:45.244912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"2b454f16774527928b9df78521f93b16c0055d0b08e28e664f0730245cc4e62b"} Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 
23:08:45.551469 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" path="/var/lib/kubelet/pods/8065cb60-3c91-4fbc-89f1-7d73d11a85e5/volumes" Feb 19 23:08:46 crc kubenswrapper[4795]: I0219 23:08:46.050089 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"] Feb 19 23:08:46 crc kubenswrapper[4795]: I0219 23:08:46.067607 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"] Feb 19 23:08:46 crc kubenswrapper[4795]: I0219 23:08:46.256524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94"} Feb 19 23:08:46 crc kubenswrapper[4795]: I0219 23:08:46.256878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39"} Feb 19 23:08:47 crc kubenswrapper[4795]: I0219 23:08:47.524642 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7314a002-868e-4028-b341-b719a609e21c" path="/var/lib/kubelet/pods/7314a002-868e-4028-b341-b719a609e21c/volumes" Feb 19 23:08:48 crc kubenswrapper[4795]: I0219 23:08:48.296702 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb"} Feb 19 23:08:48 crc kubenswrapper[4795]: I0219 23:08:48.296975 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 23:08:48 crc kubenswrapper[4795]: I0219 23:08:48.334452 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.104368068 podStartE2EDuration="5.334424654s" podCreationTimestamp="2026-02-19 23:08:43 +0000 UTC" firstStartedPulling="2026-02-19 23:08:44.326134323 +0000 UTC m=+6035.518652187" lastFinishedPulling="2026-02-19 23:08:47.556190909 +0000 UTC m=+6038.748708773" observedRunningTime="2026-02-19 23:08:48.322920937 +0000 UTC m=+6039.515438811" watchObservedRunningTime="2026-02-19 23:08:48.334424654 +0000 UTC m=+6039.526942518" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.154570 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"] Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.156228 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.160381 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.168598 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"] Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.180151 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-657cv"] Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.185619 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-657cv" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.230674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.230781 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.236705 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-657cv"] Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.333466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.333529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.333586 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.333614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.334736 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.375804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.435544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv" Feb 19 23:08:49 crc 
kubenswrapper[4795]: I0219 23:08:49.435715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.437327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.453722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.473141 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.516048 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-657cv" Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.996639 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"] Feb 19 23:08:50 crc kubenswrapper[4795]: W0219 23:08:50.001571 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a6dde49_b59e_4a4a_ad84_6386aa1dc6ce.slice/crio-0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7 WatchSource:0}: Error finding container 0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7: Status 404 returned error can't find the container with id 0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7 Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.148742 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-657cv"] Feb 19 23:08:50 crc kubenswrapper[4795]: W0219 23:08:50.148859 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeadcaa3c_623a_409a_b735_2a38854c8036.slice/crio-602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff WatchSource:0}: Error finding container 602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff: Status 404 returned error can't find the container with id 602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.335568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4647-account-create-update-f7k78" event={"ID":"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce","Type":"ContainerStarted","Data":"74c8a86b466b15b56eddcfa8140418aa598bd95ce7f01643ab2976a1bd25cfe5"} Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.335612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4647-account-create-update-f7k78" 
event={"ID":"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce","Type":"ContainerStarted","Data":"0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7"} Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.337271 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-657cv" event={"ID":"eadcaa3c-623a-409a-b735-2a38854c8036","Type":"ContainerStarted","Data":"2537d15feb1a2c6f922f8d39bb6ed8ef442cb362fc714c8399a87d6d69b4784d"} Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.337314 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-657cv" event={"ID":"eadcaa3c-623a-409a-b735-2a38854c8036","Type":"ContainerStarted","Data":"602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff"} Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.353026 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-4647-account-create-update-f7k78" podStartSLOduration=1.353008083 podStartE2EDuration="1.353008083s" podCreationTimestamp="2026-02-19 23:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:08:50.350572838 +0000 UTC m=+6041.543090692" watchObservedRunningTime="2026-02-19 23:08:50.353008083 +0000 UTC m=+6041.545525947" Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.390532 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-657cv" podStartSLOduration=1.390513532 podStartE2EDuration="1.390513532s" podCreationTimestamp="2026-02-19 23:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:08:50.367749716 +0000 UTC m=+6041.560267590" watchObservedRunningTime="2026-02-19 23:08:50.390513532 +0000 UTC m=+6041.583031396" Feb 19 23:08:51 crc kubenswrapper[4795]: I0219 23:08:51.351518 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" containerID="74c8a86b466b15b56eddcfa8140418aa598bd95ce7f01643ab2976a1bd25cfe5" exitCode=0 Feb 19 23:08:51 crc kubenswrapper[4795]: I0219 23:08:51.352395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4647-account-create-update-f7k78" event={"ID":"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce","Type":"ContainerDied","Data":"74c8a86b466b15b56eddcfa8140418aa598bd95ce7f01643ab2976a1bd25cfe5"} Feb 19 23:08:51 crc kubenswrapper[4795]: I0219 23:08:51.354245 4795 generic.go:334] "Generic (PLEG): container finished" podID="eadcaa3c-623a-409a-b735-2a38854c8036" containerID="2537d15feb1a2c6f922f8d39bb6ed8ef442cb362fc714c8399a87d6d69b4784d" exitCode=0 Feb 19 23:08:51 crc kubenswrapper[4795]: I0219 23:08:51.354293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-657cv" event={"ID":"eadcaa3c-623a-409a-b735-2a38854c8036","Type":"ContainerDied","Data":"2537d15feb1a2c6f922f8d39bb6ed8ef442cb362fc714c8399a87d6d69b4784d"} Feb 19 23:08:52 crc kubenswrapper[4795]: I0219 23:08:52.957478 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-657cv" Feb 19 23:08:52 crc kubenswrapper[4795]: I0219 23:08:52.961844 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.126629 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") pod \"eadcaa3c-623a-409a-b735-2a38854c8036\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") pod \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127078 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") pod \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127104 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") pod \"eadcaa3c-623a-409a-b735-2a38854c8036\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" (UID: "4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127630 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eadcaa3c-623a-409a-b735-2a38854c8036" (UID: "eadcaa3c-623a-409a-b735-2a38854c8036"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.132559 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp" (OuterVolumeSpecName: "kube-api-access-cr7xp") pod "4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" (UID: "4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce"). InnerVolumeSpecName "kube-api-access-cr7xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.132980 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg" (OuterVolumeSpecName: "kube-api-access-s52pg") pod "eadcaa3c-623a-409a-b735-2a38854c8036" (UID: "eadcaa3c-623a-409a-b735-2a38854c8036"). InnerVolumeSpecName "kube-api-access-s52pg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.228950 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.228987 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.228997 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.229007 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.376471 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-4647-account-create-update-f7k78" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.376286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4647-account-create-update-f7k78" event={"ID":"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce","Type":"ContainerDied","Data":"0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7"} Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.378297 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.386584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-657cv" event={"ID":"eadcaa3c-623a-409a-b735-2a38854c8036","Type":"ContainerDied","Data":"602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff"} Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.386623 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff" Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.386686 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-657cv" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.630065 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-8fzxs"] Feb 19 23:08:54 crc kubenswrapper[4795]: E0219 23:08:54.632271 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" containerName="mariadb-account-create-update" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.632294 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" containerName="mariadb-account-create-update" Feb 19 23:08:54 crc kubenswrapper[4795]: E0219 23:08:54.632337 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eadcaa3c-623a-409a-b735-2a38854c8036" containerName="mariadb-database-create" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.632348 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eadcaa3c-623a-409a-b735-2a38854c8036" containerName="mariadb-database-create" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.632615 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" containerName="mariadb-account-create-update" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.632660 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eadcaa3c-623a-409a-b735-2a38854c8036" containerName="mariadb-database-create" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.633606 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.636342 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-g5f8g" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.642258 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-8fzxs"] Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.642349 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.771924 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.772034 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.772136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.772152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.873655 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.873976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.874059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.874158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.887049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") pod \"manila-db-sync-8fzxs\" (UID: 
\"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.887053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.887278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.889515 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.962628 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-8fzxs" Feb 19 23:08:55 crc kubenswrapper[4795]: I0219 23:08:55.760241 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-8fzxs"] Feb 19 23:08:56 crc kubenswrapper[4795]: I0219 23:08:56.412596 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8fzxs" event={"ID":"bd19113b-623e-4f3e-8392-09968a5d71f9","Type":"ContainerStarted","Data":"2de2f6bee0fa48983ed9cf80a52a1fcd93154a98d917c415b7634301a3151ebf"} Feb 19 23:08:59 crc kubenswrapper[4795]: I0219 23:08:59.033149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"] Feb 19 23:08:59 crc kubenswrapper[4795]: I0219 23:08:59.047084 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"] Feb 19 23:08:59 crc kubenswrapper[4795]: I0219 23:08:59.526047 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0532ff51-023e-4663-9c95-6545236a8fb3" path="/var/lib/kubelet/pods/0532ff51-023e-4663-9c95-6545236a8fb3/volumes" Feb 19 23:09:01 crc kubenswrapper[4795]: I0219 23:09:01.459911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8fzxs" event={"ID":"bd19113b-623e-4f3e-8392-09968a5d71f9","Type":"ContainerStarted","Data":"a15bc43318a5f56769ebc9469003d0ddb425e341f35a3faca3d202f2707e3b2d"} Feb 19 23:09:01 crc kubenswrapper[4795]: I0219 23:09:01.483976 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-8fzxs" podStartSLOduration=2.917938107 podStartE2EDuration="7.483959205s" podCreationTimestamp="2026-02-19 23:08:54 +0000 UTC" firstStartedPulling="2026-02-19 23:08:55.775250543 +0000 UTC m=+6046.967768407" lastFinishedPulling="2026-02-19 23:09:00.341271641 +0000 UTC m=+6051.533789505" observedRunningTime="2026-02-19 23:09:01.476538398 +0000 UTC m=+6052.669056272" watchObservedRunningTime="2026-02-19 
23:09:01.483959205 +0000 UTC m=+6052.676477069" Feb 19 23:09:03 crc kubenswrapper[4795]: I0219 23:09:03.478431 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd19113b-623e-4f3e-8392-09968a5d71f9" containerID="a15bc43318a5f56769ebc9469003d0ddb425e341f35a3faca3d202f2707e3b2d" exitCode=0 Feb 19 23:09:03 crc kubenswrapper[4795]: I0219 23:09:03.478591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8fzxs" event={"ID":"bd19113b-623e-4f3e-8392-09968a5d71f9","Type":"ContainerDied","Data":"a15bc43318a5f56769ebc9469003d0ddb425e341f35a3faca3d202f2707e3b2d"} Feb 19 23:09:04 crc kubenswrapper[4795]: I0219 23:09:04.960635 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-8fzxs" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.109967 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") pod \"bd19113b-623e-4f3e-8392-09968a5d71f9\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.110141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") pod \"bd19113b-623e-4f3e-8392-09968a5d71f9\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.110186 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") pod \"bd19113b-623e-4f3e-8392-09968a5d71f9\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.110229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") pod \"bd19113b-623e-4f3e-8392-09968a5d71f9\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.115956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp" (OuterVolumeSpecName: "kube-api-access-674wp") pod "bd19113b-623e-4f3e-8392-09968a5d71f9" (UID: "bd19113b-623e-4f3e-8392-09968a5d71f9"). InnerVolumeSpecName "kube-api-access-674wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.116265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "bd19113b-623e-4f3e-8392-09968a5d71f9" (UID: "bd19113b-623e-4f3e-8392-09968a5d71f9"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.117756 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data" (OuterVolumeSpecName: "config-data") pod "bd19113b-623e-4f3e-8392-09968a5d71f9" (UID: "bd19113b-623e-4f3e-8392-09968a5d71f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.139841 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd19113b-623e-4f3e-8392-09968a5d71f9" (UID: "bd19113b-623e-4f3e-8392-09968a5d71f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.212981 4795 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.213193 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.213263 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.213347 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.497924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8fzxs" event={"ID":"bd19113b-623e-4f3e-8392-09968a5d71f9","Type":"ContainerDied","Data":"2de2f6bee0fa48983ed9cf80a52a1fcd93154a98d917c415b7634301a3151ebf"} Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.497961 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2de2f6bee0fa48983ed9cf80a52a1fcd93154a98d917c415b7634301a3151ebf" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.497967 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-8fzxs" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.813782 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Feb 19 23:09:05 crc kubenswrapper[4795]: E0219 23:09:05.814788 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd19113b-623e-4f3e-8392-09968a5d71f9" containerName="manila-db-sync" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.814809 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd19113b-623e-4f3e-8392-09968a5d71f9" containerName="manila-db-sync" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.825607 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd19113b-623e-4f3e-8392-09968a5d71f9" containerName="manila-db-sync" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.827956 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.837342 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.837933 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.838399 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.844316 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-g5f8g" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.873912 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.916129 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Feb 19 23:09:05 crc 
kubenswrapper[4795]: I0219 23:09:05.920579 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.923828 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.930046 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.933567 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-scripts\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.933787 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.933996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.936305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " 
pod="openstack/manila-scheduler-0" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.936465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g28d5\" (UniqueName: \"kubernetes.io/projected/25554074-26bb-4b62-a1f9-dac4cd6308b4-kube-api-access-g28d5\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.936555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25554074-26bb-4b62-a1f9-dac4cd6308b4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.953797 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.956340 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.963293 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039183 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039289 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039359 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039519 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g28d5\" (UniqueName: \"kubernetes.io/projected/25554074-26bb-4b62-a1f9-dac4cd6308b4-kube-api-access-g28d5\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25554074-26bb-4b62-a1f9-dac4cd6308b4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4nzg\" (UniqueName: 
\"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-kube-api-access-c4nzg\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-ceph\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-scripts\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-scripts\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039687 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.040102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25554074-26bb-4b62-a1f9-dac4cd6308b4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: 
\"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.046571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-scripts\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.047071 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.047344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.047789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.058684 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g28d5\" (UniqueName: \"kubernetes.io/projected/25554074-26bb-4b62-a1f9-dac4cd6308b4-kube-api-access-g28d5\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.095249 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/manila-api-0"] Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.097253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.100839 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.119185 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.141942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-ceph\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-scripts\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 
crc kubenswrapper[4795]: I0219 23:09:06.142698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143112 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143233 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4nzg\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-kube-api-access-c4nzg\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.145275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.146392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.150014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.150788 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.151394 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.152354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.153270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " 
pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.153443 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.162703 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.164897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-ceph\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.170435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-scripts\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.170961 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.178955 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4nzg\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-kube-api-access-c4nzg\") pod \"manila-share-share1-0\" (UID: 
\"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.185071 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.246631 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-etc-machine-id\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.246686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data-custom\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.246916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-logs\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.247051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-scripts\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 
23:09:06.247282 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.247497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.247608 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77q29\" (UniqueName: \"kubernetes.io/projected/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-kube-api-access-77q29\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.262193 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.279236 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77q29\" (UniqueName: \"kubernetes.io/projected/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-kube-api-access-77q29\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-etc-machine-id\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349810 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data-custom\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349886 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-logs\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-scripts\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.352058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-logs\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.352131 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-etc-machine-id\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.356943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-scripts\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.359243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.372978 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data-custom\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.373620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.392865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77q29\" (UniqueName: \"kubernetes.io/projected/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-kube-api-access-77q29\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.593522 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.731939 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 19 23:09:06 crc kubenswrapper[4795]: W0219 23:09:06.760309 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25554074_26bb_4b62_a1f9_dac4cd6308b4.slice/crio-3340f030743b87c44179e06785a9f331f79071662606164b653198513ed17653 WatchSource:0}: Error finding container 3340f030743b87c44179e06785a9f331f79071662606164b653198513ed17653: Status 404 returned error can't find the container with id 3340f030743b87c44179e06785a9f331f79071662606164b653198513ed17653 Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.103218 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.114914 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.187995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.566304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerStarted","Data":"78d47d6b1b605f27ba7148a48207891b6a8dfb70fcf1cbffd1a75f4272077818"} Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.572981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"25554074-26bb-4b62-a1f9-dac4cd6308b4","Type":"ContainerStarted","Data":"3340f030743b87c44179e06785a9f331f79071662606164b653198513ed17653"} Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.575193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864","Type":"ContainerStarted","Data":"12303e2812dd7f41d5986b133f2354eb3e2eb37ec80e42c2761c4a2c65a83170"} Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.577232 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d","Type":"ContainerStarted","Data":"16c2bad66bbd5e23781823b78ccb5fbbafc69f6c84985671756edddcf7e88b1a"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.599807 4795 generic.go:334] "Generic (PLEG): container finished" podID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerID="1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52" exitCode=0 Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.599888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerDied","Data":"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.625003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"25554074-26bb-4b62-a1f9-dac4cd6308b4","Type":"ContainerStarted","Data":"80b774f5e004853fd442b3883c009e31328c0024b5ff1f0bf808913283df8cec"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.625065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"25554074-26bb-4b62-a1f9-dac4cd6308b4","Type":"ContainerStarted","Data":"197912b640f4bcb72c1bc7f3064ff9382adb0f54d892c2e831f7536ed5c2c540"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.640571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d","Type":"ContainerStarted","Data":"619ee3a23c0cee7450cc7808138eb296572fce554dc64ef68c9f4a217ae21bb1"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.643420 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d","Type":"ContainerStarted","Data":"9a37176c13c4409f76767df52c8daaeb1ed37ac49259ce43926381ed6dda07df"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.643467 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.676325 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.051637834 podStartE2EDuration="3.676304216s" podCreationTimestamp="2026-02-19 23:09:05 +0000 UTC" firstStartedPulling="2026-02-19 23:09:06.767394058 +0000 UTC m=+6057.959911922" lastFinishedPulling="2026-02-19 23:09:07.39206044 +0000 UTC m=+6058.584578304" observedRunningTime="2026-02-19 23:09:08.662070157 +0000 UTC m=+6059.854588021" watchObservedRunningTime="2026-02-19 23:09:08.676304216 +0000 UTC m=+6059.868822080" Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.741999 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.7419829460000003 podStartE2EDuration="2.741982946s" podCreationTimestamp="2026-02-19 23:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:09:08.741459582 +0000 UTC m=+6059.933977446" watchObservedRunningTime="2026-02-19 23:09:08.741982946 +0000 UTC m=+6059.934500800" Feb 19 23:09:09 crc kubenswrapper[4795]: I0219 23:09:09.656718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerStarted","Data":"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b"} Feb 19 23:09:09 crc kubenswrapper[4795]: I0219 23:09:09.683069 4795 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" podStartSLOduration=4.683050758 podStartE2EDuration="4.683050758s" podCreationTimestamp="2026-02-19 23:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:09:09.673677188 +0000 UTC m=+6060.866195052" watchObservedRunningTime="2026-02-19 23:09:09.683050758 +0000 UTC m=+6060.875568622" Feb 19 23:09:10 crc kubenswrapper[4795]: I0219 23:09:10.672018 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:13 crc kubenswrapper[4795]: I0219 23:09:13.683756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 23:09:15 crc kubenswrapper[4795]: I0219 23:09:15.766682 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864","Type":"ContainerStarted","Data":"9258f71dca60325ce87ec0600932b5afe531b726018c02e457dd257e1fd79f11"} Feb 19 23:09:15 crc kubenswrapper[4795]: I0219 23:09:15.767238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864","Type":"ContainerStarted","Data":"8b3ee7d3c7c3b6df6ad6d84f9683cbddbc63cf20f7beb14ba96413da1c664f62"} Feb 19 23:09:15 crc kubenswrapper[4795]: I0219 23:09:15.791341 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.580446182 podStartE2EDuration="10.791325256s" podCreationTimestamp="2026-02-19 23:09:05 +0000 UTC" firstStartedPulling="2026-02-19 23:09:07.098491999 +0000 UTC m=+6058.291009863" lastFinishedPulling="2026-02-19 23:09:14.309371073 +0000 UTC m=+6065.501888937" observedRunningTime="2026-02-19 23:09:15.790102883 +0000 UTC m=+6066.982620767" watchObservedRunningTime="2026-02-19 23:09:15.791325256 
+0000 UTC m=+6066.983843120" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.162938 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.272389 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.280412 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.357926 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.358216 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="dnsmasq-dns" containerID="cri-o://af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46" gracePeriod=10 Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.782397 4795 generic.go:334] "Generic (PLEG): container finished" podID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerID="af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46" exitCode=0 Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.784226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerDied","Data":"af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46"} Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.945598 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996471 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996720 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.003000 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd" (OuterVolumeSpecName: "kube-api-access-vc6jd") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "kube-api-access-vc6jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.068087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.080146 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config" (OuterVolumeSpecName: "config") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.097391 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.099183 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.099201 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.099213 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.099223 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.103088 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.201333 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.793671 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.794283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerDied","Data":"676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8"} Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.794323 4795 scope.go:117] "RemoveContainer" containerID="af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.816694 4795 scope.go:117] "RemoveContainer" containerID="b799c6817a13193660264c0e1a6b6bbda173865957b6b9f053b5f17c7df00f19" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.823132 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.864349 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.409452 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.410075 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-central-agent" containerID="cri-o://44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc" gracePeriod=30 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.410236 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-notification-agent" containerID="cri-o://ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39" gracePeriod=30 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.410222 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="sg-core" containerID="cri-o://807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94" gracePeriod=30 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.410419 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="proxy-httpd" containerID="cri-o://32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb" gracePeriod=30 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.523132 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" path="/var/lib/kubelet/pods/b20710ae-8abe-4d80-8cdf-582fe785e2cc/volumes" Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.831681 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerID="32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb" exitCode=0 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.832176 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerID="807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94" exitCode=2 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.831795 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb"} Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.832223 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94"} Feb 19 23:09:20 crc 
kubenswrapper[4795]: I0219 23:09:20.845020 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerID="44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc" exitCode=0 Feb 19 23:09:20 crc kubenswrapper[4795]: I0219 23:09:20.845080 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc"} Feb 19 23:09:21 crc kubenswrapper[4795]: I0219 23:09:21.862375 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerID="ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39" exitCode=0 Feb 19 23:09:21 crc kubenswrapper[4795]: I0219 23:09:21.863061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39"} Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.093766 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152081 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152119 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152200 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152294 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152500 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152545 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.153765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.153832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.161430 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts" (OuterVolumeSpecName: "scripts") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.174434 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6" (OuterVolumeSpecName: "kube-api-access-stfh6") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "kube-api-access-stfh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.215463 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254615 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254651 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254662 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254671 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") on node 
\"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254679 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.264297 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data" (OuterVolumeSpecName: "config-data") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.282320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.356329 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.356378 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.873552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"2b454f16774527928b9df78521f93b16c0055d0b08e28e664f0730245cc4e62b"} Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.873633 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.873923 4795 scope.go:117] "RemoveContainer" containerID="32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.893073 4795 scope.go:117] "RemoveContainer" containerID="807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.919087 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.922470 4795 scope.go:117] "RemoveContainer" containerID="ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.932517 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.943874 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944353 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="dnsmasq-dns" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944371 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="dnsmasq-dns" Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944395 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-central-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944403 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-central-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944418 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="sg-core" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="sg-core" Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944437 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="init" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944442 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="init" Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944452 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="proxy-httpd" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944460 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="proxy-httpd" Feb 
19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944472 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-notification-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944478 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-notification-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944661 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="proxy-httpd" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944675 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="dnsmasq-dns" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944688 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-central-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944700 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="sg-core" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944717 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-notification-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.946665 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.948877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.950269 4795 scope.go:117] "RemoveContainer" containerID="44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.950403 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.957932 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974691 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-config-data\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974785 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-run-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k9sx2\" (UniqueName: \"kubernetes.io/projected/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-kube-api-access-k9sx2\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-scripts\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.975038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-log-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9sx2\" (UniqueName: \"kubernetes.io/projected/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-kube-api-access-k9sx2\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078495 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-scripts\") pod \"ceilometer-0\" (UID: 
\"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078569 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-log-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-config-data\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-run-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.085332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.089542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-scripts\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.093059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.093511 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-log-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.098322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-config-data\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.098727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-run-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.107137 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k9sx2\" (UniqueName: \"kubernetes.io/projected/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-kube-api-access-k9sx2\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.263840 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.527445 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" path="/var/lib/kubelet/pods/d7d7882a-86ad-4b48-8280-af2ced7c6807/volumes" Feb 19 23:09:23 crc kubenswrapper[4795]: W0219 23:09:23.834568 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87a2d0b8_5866_4d88_ab91_cd94c2136c6c.slice/crio-20a146dd837978cead831ac646b17a84a953d23ffc337a7ad35c8db3d22c778e WatchSource:0}: Error finding container 20a146dd837978cead831ac646b17a84a953d23ffc337a7ad35c8db3d22c778e: Status 404 returned error can't find the container with id 20a146dd837978cead831ac646b17a84a953d23ffc337a7ad35c8db3d22c778e Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.837749 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.843391 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.884209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"20a146dd837978cead831ac646b17a84a953d23ffc337a7ad35c8db3d22c778e"} Feb 19 23:09:24 crc kubenswrapper[4795]: I0219 23:09:24.894038 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"d3b098895da80405758df0193d773c7f712d32d244e01634a1b1139445cb875e"} Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.844509 4795 scope.go:117] "RemoveContainer" containerID="2096e22df9aade6dc70570abe24d3dbe208045474ec5189f56a5d1beac2fa739" Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.871861 4795 scope.go:117] "RemoveContainer" containerID="eb9dfa4e109128ae704882fd6740207250d7781752ee79e05d86190dadc20b22" Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.905023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"a0e565fdba0c7abd230f8945d0858103ea10b5326777256cbafabf6bace4a840"} Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.921420 4795 scope.go:117] "RemoveContainer" containerID="eae4ff82798e1d79080d2f3fe94f82eb86492da01dc7f84fe2ae4aff95927ca0" Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.953152 4795 scope.go:117] "RemoveContainer" containerID="fd9abe25e7c7b1351c86593c296366b3dd62af78525a82ce72038120bd5feb1c" Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.989839 4795 scope.go:117] "RemoveContainer" containerID="3e6c87188fd15e9809b882f54d39f3a4062c4abbd3e889c36c86f66a2c644f00" Feb 19 23:09:26 crc kubenswrapper[4795]: I0219 23:09:26.917419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"92523c6404f017ec2be90d32dc4cf26336fb7adcf1830fd82adee31e7f856fa0"} Feb 19 23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.793146 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Feb 19 23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.799489 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Feb 19 
23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.926214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"a30654e19b3666fea6ec0cf0d65c9b3a5c7aaa72197820ff0b31b291411781d3"} Feb 19 23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.927417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.955820 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.742023213 podStartE2EDuration="5.955800256s" podCreationTimestamp="2026-02-19 23:09:22 +0000 UTC" firstStartedPulling="2026-02-19 23:09:23.837489144 +0000 UTC m=+6075.030007008" lastFinishedPulling="2026-02-19 23:09:27.051266187 +0000 UTC m=+6078.243784051" observedRunningTime="2026-02-19 23:09:27.942140972 +0000 UTC m=+6079.134658836" watchObservedRunningTime="2026-02-19 23:09:27.955800256 +0000 UTC m=+6079.148318130" Feb 19 23:09:28 crc kubenswrapper[4795]: I0219 23:09:28.207788 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Feb 19 23:09:28 crc kubenswrapper[4795]: I0219 23:09:28.427049 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:09:28 crc kubenswrapper[4795]: I0219 23:09:28.427493 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:09:42 crc 
kubenswrapper[4795]: I0219 23:09:42.053942 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 23:09:42 crc kubenswrapper[4795]: I0219 23:09:42.068900 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 23:09:42 crc kubenswrapper[4795]: I0219 23:09:42.084155 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 23:09:42 crc kubenswrapper[4795]: I0219 23:09:42.094896 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 23:09:43 crc kubenswrapper[4795]: I0219 23:09:43.540427 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eae15d3-0be7-4510-9803-a7ad3f947148" path="/var/lib/kubelet/pods/6eae15d3-0be7-4510-9803-a7ad3f947148/volumes" Feb 19 23:09:43 crc kubenswrapper[4795]: I0219 23:09:43.541905 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de06c33e-b82b-46eb-964b-4bdd02c94166" path="/var/lib/kubelet/pods/de06c33e-b82b-46eb-964b-4bdd02c94166/volumes" Feb 19 23:09:50 crc kubenswrapper[4795]: I0219 23:09:50.037853 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 23:09:50 crc kubenswrapper[4795]: I0219 23:09:50.055193 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 23:09:51 crc kubenswrapper[4795]: I0219 23:09:51.527083 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51673183-2fe8-4a11-98f0-dec10081e7fc" path="/var/lib/kubelet/pods/51673183-2fe8-4a11-98f0-dec10081e7fc/volumes" Feb 19 23:09:53 crc kubenswrapper[4795]: I0219 23:09:53.268231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 23:09:58 crc kubenswrapper[4795]: I0219 23:09:58.427349 4795 patch_prober.go:28] interesting 
pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:09:58 crc kubenswrapper[4795]: I0219 23:09:58.428093 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.673193 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.675383 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.677191 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.686135 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.769784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.769957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") pod 
\"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.770045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.770070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t5jz\" (UniqueName: \"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.770103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.770272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: 
\"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t5jz\" (UniqueName: \"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872358 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.873320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.873530 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.873627 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.873803 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.874079 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: 
\"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.891979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t5jz\" (UniqueName: \"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.992375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:13 crc kubenswrapper[4795]: I0219 23:10:13.479957 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:14 crc kubenswrapper[4795]: I0219 23:10:14.378197 4795 generic.go:334] "Generic (PLEG): container finished" podID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerID="2e01aa79b584e456d7bb177df7e3c0e3d240599ed160c0a0253a370986182831" exitCode=0 Feb 19 23:10:14 crc kubenswrapper[4795]: I0219 23:10:14.378261 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerDied","Data":"2e01aa79b584e456d7bb177df7e3c0e3d240599ed160c0a0253a370986182831"} Feb 19 23:10:14 crc kubenswrapper[4795]: I0219 23:10:14.378761 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerStarted","Data":"330f81a91d8abda42411ed5c57208925e27d58bc7869c4940f8759238ea316af"} Feb 19 23:10:15 crc kubenswrapper[4795]: I0219 23:10:15.388115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" 
event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerStarted","Data":"6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7"} Feb 19 23:10:15 crc kubenswrapper[4795]: I0219 23:10:15.388572 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:15 crc kubenswrapper[4795]: I0219 23:10:15.411313 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" podStartSLOduration=3.411288436 podStartE2EDuration="3.411288436s" podCreationTimestamp="2026-02-19 23:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:10:15.406141999 +0000 UTC m=+6126.598659863" watchObservedRunningTime="2026-02-19 23:10:15.411288436 +0000 UTC m=+6126.603806300" Feb 19 23:10:22 crc kubenswrapper[4795]: I0219 23:10:22.994451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.057522 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.057743 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="dnsmasq-dns" containerID="cri-o://971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" gracePeriod=10 Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.209470 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67d97dc55-pvbjd"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.212720 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.239518 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67d97dc55-pvbjd"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309055 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-nb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-sb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-config\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glsg7\" (UniqueName: \"kubernetes.io/projected/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-kube-api-access-glsg7\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-openstack-cell1\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-dns-svc\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-nb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-sb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411696 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-config\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411746 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glsg7\" (UniqueName: 
\"kubernetes.io/projected/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-kube-api-access-glsg7\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-openstack-cell1\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-dns-svc\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.412513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-nb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.412738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-config\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.412874 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-dns-svc\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: 
\"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.412878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-openstack-cell1\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.413220 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-sb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.442372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glsg7\" (UniqueName: \"kubernetes.io/projected/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-kube-api-access-glsg7\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.541519 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.642980 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716549 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716703 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716725 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716775 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.734453 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8" (OuterVolumeSpecName: "kube-api-access-6wdb8") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "kube-api-access-6wdb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.806927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config" (OuterVolumeSpecName: "config") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.819106 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.819140 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.819642 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.825897 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.828853 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893464 4795 generic.go:334] "Generic (PLEG): container finished" podID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerID="971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" exitCode=0 Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerDied","Data":"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b"} Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerDied","Data":"78d47d6b1b605f27ba7148a48207891b6a8dfb70fcf1cbffd1a75f4272077818"} Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893543 4795 scope.go:117] "RemoveContainer" containerID="971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" 
Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893654 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.917312 4795 scope.go:117] "RemoveContainer" containerID="1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.920457 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.920482 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.920492 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.935335 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.946958 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.962062 4795 scope.go:117] "RemoveContainer" containerID="971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" Feb 19 23:10:23 crc kubenswrapper[4795]: E0219 23:10:23.962676 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b\": container with ID starting with 
971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b not found: ID does not exist" containerID="971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.962803 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b"} err="failed to get container status \"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b\": rpc error: code = NotFound desc = could not find container \"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b\": container with ID starting with 971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b not found: ID does not exist" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.962901 4795 scope.go:117] "RemoveContainer" containerID="1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52" Feb 19 23:10:23 crc kubenswrapper[4795]: E0219 23:10:23.963484 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52\": container with ID starting with 1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52 not found: ID does not exist" containerID="1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.963589 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52"} err="failed to get container status \"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52\": rpc error: code = NotFound desc = could not find container \"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52\": container with ID starting with 1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52 not found: ID does not 
exist" Feb 19 23:10:24 crc kubenswrapper[4795]: I0219 23:10:24.003848 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67d97dc55-pvbjd"] Feb 19 23:10:24 crc kubenswrapper[4795]: I0219 23:10:24.907772 4795 generic.go:334] "Generic (PLEG): container finished" podID="f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0" containerID="257ca0891e832a13103416a629b50498355f02d46fc45d6dd4a1d4893c3d0e68" exitCode=0 Feb 19 23:10:24 crc kubenswrapper[4795]: I0219 23:10:24.907889 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" event={"ID":"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0","Type":"ContainerDied","Data":"257ca0891e832a13103416a629b50498355f02d46fc45d6dd4a1d4893c3d0e68"} Feb 19 23:10:24 crc kubenswrapper[4795]: I0219 23:10:24.908141 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" event={"ID":"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0","Type":"ContainerStarted","Data":"26c39c8d6d5c335161f6b2b6d3ebdc63222bb1f8be5c31d6aa4639cd5621bb72"} Feb 19 23:10:25 crc kubenswrapper[4795]: I0219 23:10:25.524819 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" path="/var/lib/kubelet/pods/30effec6-7cdf-4ef1-b828-ff6327bb6bce/volumes" Feb 19 23:10:25 crc kubenswrapper[4795]: I0219 23:10:25.920610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" event={"ID":"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0","Type":"ContainerStarted","Data":"867b817d4ada73b53e336ceb69eb248dff3e7fbc04feedd0aa95fb5b38aa93ae"} Feb 19 23:10:25 crc kubenswrapper[4795]: I0219 23:10:25.920743 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:25 crc kubenswrapper[4795]: I0219 23:10:25.947651 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" 
podStartSLOduration=2.947618408 podStartE2EDuration="2.947618408s" podCreationTimestamp="2026-02-19 23:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:10:25.941224238 +0000 UTC m=+6137.133742122" watchObservedRunningTime="2026-02-19 23:10:25.947618408 +0000 UTC m=+6137.140136292" Feb 19 23:10:26 crc kubenswrapper[4795]: I0219 23:10:26.234344 4795 scope.go:117] "RemoveContainer" containerID="dc19c0a9eee505e65f65e0357256fb1d1ef9373c082944f9697154c21d026163" Feb 19 23:10:26 crc kubenswrapper[4795]: I0219 23:10:26.261334 4795 scope.go:117] "RemoveContainer" containerID="87dbef2ca275f0dcf2d6fcb445371c808cbb03bd9ce8927982987b0e668dfab9" Feb 19 23:10:26 crc kubenswrapper[4795]: I0219 23:10:26.354421 4795 scope.go:117] "RemoveContainer" containerID="ad3831db9a9f6e12a5ac1eb158cce87c770b2e137cadf64a7ed412c1aedd54c2" Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.427667 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.429819 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.429972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.431077 4795 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.431262 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d" gracePeriod=600 Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.988021 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d" exitCode=0 Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.988126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d"} Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.988444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4"} Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.988477 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:10:33 crc kubenswrapper[4795]: I0219 23:10:33.567006 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:33 crc kubenswrapper[4795]: I0219 23:10:33.648410 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:33 crc kubenswrapper[4795]: I0219 23:10:33.649391 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="dnsmasq-dns" containerID="cri-o://6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7" gracePeriod=10 Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.050301 4795 generic.go:334] "Generic (PLEG): container finished" podID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerID="6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7" exitCode=0 Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.050647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerDied","Data":"6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7"} Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.160829 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301011 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301304 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301463 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301513 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t5jz\" (UniqueName: 
\"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.311476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz" (OuterVolumeSpecName: "kube-api-access-9t5jz") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "kube-api-access-9t5jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.368691 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.373411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.383566 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.389712 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.398357 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config" (OuterVolumeSpecName: "config") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404529 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404561 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404572 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404585 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc 
kubenswrapper[4795]: I0219 23:10:34.404597 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t5jz\" (UniqueName: \"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404610 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.063313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerDied","Data":"330f81a91d8abda42411ed5c57208925e27d58bc7869c4940f8759238ea316af"} Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.064050 4795 scope.go:117] "RemoveContainer" containerID="6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7" Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.063683 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.095300 4795 scope.go:117] "RemoveContainer" containerID="2e01aa79b584e456d7bb177df7e3c0e3d240599ed160c0a0253a370986182831" Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.122190 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.138698 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.522594 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" path="/var/lib/kubelet/pods/b3d4a3d4-7002-422c-af86-2500e4c15e0b/volumes" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.708472 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:36 crc kubenswrapper[4795]: E0219 23:10:36.709320 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="init" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709335 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="init" Feb 19 23:10:36 crc kubenswrapper[4795]: E0219 23:10:36.709369 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="init" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709375 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="init" Feb 19 23:10:36 crc kubenswrapper[4795]: E0219 23:10:36.709387 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 
23:10:36.709392 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: E0219 23:10:36.709411 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709417 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709602 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709623 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.711102 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.723341 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.762947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.763261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.763472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.864851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.865612 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.865317 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.865885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.866262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.887773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:37 crc kubenswrapper[4795]: I0219 23:10:37.029866 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:37 crc kubenswrapper[4795]: I0219 23:10:37.509861 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:37 crc kubenswrapper[4795]: W0219 23:10:37.517883 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc48201_bd6d_4727_90c2_562889c16c68.slice/crio-06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e WatchSource:0}: Error finding container 06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e: Status 404 returned error can't find the container with id 06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e Feb 19 23:10:38 crc kubenswrapper[4795]: I0219 23:10:38.090958 4795 generic.go:334] "Generic (PLEG): container finished" podID="2bc48201-bd6d-4727-90c2-562889c16c68" containerID="6a05abe6f4069b29be6b5f5fcbf8ded158fa515f69f5981aa82331e623d393e5" exitCode=0 Feb 19 23:10:38 crc kubenswrapper[4795]: I0219 23:10:38.091014 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerDied","Data":"6a05abe6f4069b29be6b5f5fcbf8ded158fa515f69f5981aa82331e623d393e5"} Feb 19 23:10:38 crc kubenswrapper[4795]: I0219 23:10:38.091374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerStarted","Data":"06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e"} Feb 19 23:10:39 crc kubenswrapper[4795]: I0219 23:10:39.102123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" 
event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerStarted","Data":"eb261c992c6e6e012c7b1df86458acbc0994a9998ef4e3b76a32dcc8a4e0e786"} Feb 19 23:10:41 crc kubenswrapper[4795]: I0219 23:10:41.123758 4795 generic.go:334] "Generic (PLEG): container finished" podID="2bc48201-bd6d-4727-90c2-562889c16c68" containerID="eb261c992c6e6e012c7b1df86458acbc0994a9998ef4e3b76a32dcc8a4e0e786" exitCode=0 Feb 19 23:10:41 crc kubenswrapper[4795]: I0219 23:10:41.123831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerDied","Data":"eb261c992c6e6e012c7b1df86458acbc0994a9998ef4e3b76a32dcc8a4e0e786"} Feb 19 23:10:42 crc kubenswrapper[4795]: I0219 23:10:42.136905 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerStarted","Data":"3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0"} Feb 19 23:10:42 crc kubenswrapper[4795]: I0219 23:10:42.167279 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8dlb8" podStartSLOduration=2.758856389 podStartE2EDuration="6.167261517s" podCreationTimestamp="2026-02-19 23:10:36 +0000 UTC" firstStartedPulling="2026-02-19 23:10:38.093256316 +0000 UTC m=+6149.285774180" lastFinishedPulling="2026-02-19 23:10:41.501661444 +0000 UTC m=+6152.694179308" observedRunningTime="2026-02-19 23:10:42.157887877 +0000 UTC m=+6153.350405741" watchObservedRunningTime="2026-02-19 23:10:42.167261517 +0000 UTC m=+6153.359779381" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.715082 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n"] Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.717016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.719408 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.719531 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.719644 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.719693 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.733395 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n"] Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.823497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.823538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 
23:10:44.823607 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.823709 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.823736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.925673 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.926008 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.926637 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.926664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.926719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.931935 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.933979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.934464 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.946446 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.949024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:45 crc kubenswrapper[4795]: I0219 23:10:45.035530 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:45 crc kubenswrapper[4795]: W0219 23:10:45.616688 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ba2e854_6881_4f7f_8068_7abf4df26229.slice/crio-c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf WatchSource:0}: Error finding container c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf: Status 404 returned error can't find the container with id c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf Feb 19 23:10:45 crc kubenswrapper[4795]: I0219 23:10:45.620887 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n"] Feb 19 23:10:46 crc kubenswrapper[4795]: I0219 23:10:46.176345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" event={"ID":"7ba2e854-6881-4f7f-8068-7abf4df26229","Type":"ContainerStarted","Data":"c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf"} Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.030737 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.031081 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.100877 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.236464 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.342066 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:49 crc kubenswrapper[4795]: I0219 23:10:49.203759 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8dlb8" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="registry-server" containerID="cri-o://3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0" gracePeriod=2 Feb 19 23:10:50 crc kubenswrapper[4795]: I0219 23:10:50.215846 4795 generic.go:334] "Generic (PLEG): container finished" podID="2bc48201-bd6d-4727-90c2-562889c16c68" containerID="3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0" exitCode=0 Feb 19 23:10:50 crc kubenswrapper[4795]: I0219 23:10:50.215898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerDied","Data":"3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0"} Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.648697 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.776722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") pod \"2bc48201-bd6d-4727-90c2-562889c16c68\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.776916 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") pod \"2bc48201-bd6d-4727-90c2-562889c16c68\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.777058 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") pod \"2bc48201-bd6d-4727-90c2-562889c16c68\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.779287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities" (OuterVolumeSpecName: "utilities") pod "2bc48201-bd6d-4727-90c2-562889c16c68" (UID: "2bc48201-bd6d-4727-90c2-562889c16c68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.781260 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5" (OuterVolumeSpecName: "kube-api-access-qvpl5") pod "2bc48201-bd6d-4727-90c2-562889c16c68" (UID: "2bc48201-bd6d-4727-90c2-562889c16c68"). InnerVolumeSpecName "kube-api-access-qvpl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.823299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bc48201-bd6d-4727-90c2-562889c16c68" (UID: "2bc48201-bd6d-4727-90c2-562889c16c68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.880962 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.881238 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.881322 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.916474 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:10:55 crc kubenswrapper[4795]: E0219 23:10:55.917463 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="registry-server" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.917504 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="registry-server" Feb 19 23:10:55 crc kubenswrapper[4795]: E0219 23:10:55.917522 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="extract-content" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.917529 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="extract-content" Feb 19 23:10:55 crc kubenswrapper[4795]: E0219 23:10:55.917574 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="extract-utilities" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.917582 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="extract-utilities" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.918059 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="registry-server" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.920727 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.937497 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.084584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.084626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") pod \"redhat-operators-m4x8w\" (UID: 
\"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.084650 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.187034 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.187093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.187113 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.187931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") pod \"redhat-operators-m4x8w\" (UID: 
\"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.188134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.202304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.244796 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.284485 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" event={"ID":"7ba2e854-6881-4f7f-8068-7abf4df26229","Type":"ContainerStarted","Data":"bcded33b25e3ae27e84f77a3f23c65d96a1486e81879f51358b255339adf4627"} Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.290661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerDied","Data":"06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e"} Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.290705 4795 scope.go:117] "RemoveContainer" containerID="3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.290777 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.304796 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" podStartSLOduration=2.5750680299999997 podStartE2EDuration="12.304776082s" podCreationTimestamp="2026-02-19 23:10:44 +0000 UTC" firstStartedPulling="2026-02-19 23:10:45.619545333 +0000 UTC m=+6156.812063207" lastFinishedPulling="2026-02-19 23:10:55.349253405 +0000 UTC m=+6166.541771259" observedRunningTime="2026-02-19 23:10:56.299372578 +0000 UTC m=+6167.491890442" watchObservedRunningTime="2026-02-19 23:10:56.304776082 +0000 UTC m=+6167.497293946" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.406942 4795 scope.go:117] "RemoveContainer" containerID="eb261c992c6e6e012c7b1df86458acbc0994a9998ef4e3b76a32dcc8a4e0e786" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.427490 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.441323 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.464844 4795 scope.go:117] "RemoveContainer" containerID="6a05abe6f4069b29be6b5f5fcbf8ded158fa515f69f5981aa82331e623d393e5" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.807021 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:10:57 crc kubenswrapper[4795]: I0219 23:10:57.301799 4795 generic.go:334] "Generic (PLEG): container finished" podID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerID="2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b" exitCode=0 Feb 19 23:10:57 crc kubenswrapper[4795]: I0219 23:10:57.302000 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerDied","Data":"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b"} Feb 19 23:10:57 crc kubenswrapper[4795]: I0219 23:10:57.302145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerStarted","Data":"69a009f19892e689d5d7710989b3ec9c3b063552425a36b7ffeb2388d0a9aa9a"} Feb 19 23:10:57 crc kubenswrapper[4795]: I0219 23:10:57.526953 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" path="/var/lib/kubelet/pods/2bc48201-bd6d-4727-90c2-562889c16c68/volumes" Feb 19 23:10:58 crc kubenswrapper[4795]: I0219 23:10:58.313339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerStarted","Data":"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671"} Feb 19 23:11:02 crc kubenswrapper[4795]: I0219 23:11:02.355466 4795 generic.go:334] "Generic (PLEG): container finished" podID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerID="cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671" exitCode=0 Feb 19 23:11:02 crc kubenswrapper[4795]: I0219 23:11:02.356133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerDied","Data":"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671"} Feb 19 23:11:03 crc kubenswrapper[4795]: I0219 23:11:03.367718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" 
event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerStarted","Data":"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4"} Feb 19 23:11:03 crc kubenswrapper[4795]: I0219 23:11:03.387204 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m4x8w" podStartSLOduration=2.908654414 podStartE2EDuration="8.387182364s" podCreationTimestamp="2026-02-19 23:10:55 +0000 UTC" firstStartedPulling="2026-02-19 23:10:57.304428956 +0000 UTC m=+6168.496946820" lastFinishedPulling="2026-02-19 23:11:02.782956906 +0000 UTC m=+6173.975474770" observedRunningTime="2026-02-19 23:11:03.382569231 +0000 UTC m=+6174.575087095" watchObservedRunningTime="2026-02-19 23:11:03.387182364 +0000 UTC m=+6174.579700228" Feb 19 23:11:06 crc kubenswrapper[4795]: I0219 23:11:06.245412 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:06 crc kubenswrapper[4795]: I0219 23:11:06.247033 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:07 crc kubenswrapper[4795]: I0219 23:11:07.300997 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m4x8w" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" probeResult="failure" output=< Feb 19 23:11:07 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:11:07 crc kubenswrapper[4795]: > Feb 19 23:11:08 crc kubenswrapper[4795]: I0219 23:11:08.420778 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ba2e854-6881-4f7f-8068-7abf4df26229" containerID="bcded33b25e3ae27e84f77a3f23c65d96a1486e81879f51358b255339adf4627" exitCode=0 Feb 19 23:11:08 crc kubenswrapper[4795]: I0219 23:11:08.420861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" event={"ID":"7ba2e854-6881-4f7f-8068-7abf4df26229","Type":"ContainerDied","Data":"bcded33b25e3ae27e84f77a3f23c65d96a1486e81879f51358b255339adf4627"} Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.875617 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983008 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983559 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983673 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.991415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.003869 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d" (OuterVolumeSpecName: "kube-api-access-2s97d") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "kube-api-access-2s97d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.004185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph" (OuterVolumeSpecName: "ceph") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.014349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.019326 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory" (OuterVolumeSpecName: "inventory") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087044 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087085 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087120 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087137 4795 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087150 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.440354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" event={"ID":"7ba2e854-6881-4f7f-8068-7abf4df26229","Type":"ContainerDied","Data":"c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf"} Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.440404 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.440436 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:11:17 crc kubenswrapper[4795]: I0219 23:11:17.305862 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m4x8w" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" probeResult="failure" output=< Feb 19 23:11:17 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:11:17 crc kubenswrapper[4795]: > Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.269732 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg"] Feb 19 23:11:18 crc kubenswrapper[4795]: E0219 23:11:18.270272 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba2e854-6881-4f7f-8068-7abf4df26229" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.270297 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba2e854-6881-4f7f-8068-7abf4df26229" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.270618 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba2e854-6881-4f7f-8068-7abf4df26229" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.271562 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.272918 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.273284 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.273394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.273450 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 
23:11:18.273475 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.274056 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.274501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.276587 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.277974 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.279852 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg"] Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375795 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") 
pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.381968 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 
crc kubenswrapper[4795]: I0219 23:11:18.382980 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.382997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.385985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.392347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.600814 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:19 crc kubenswrapper[4795]: I0219 23:11:19.147811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg"] Feb 19 23:11:19 crc kubenswrapper[4795]: I0219 23:11:19.524983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" event={"ID":"c3cbdd11-d93f-4025-9c08-7530a68f6113","Type":"ContainerStarted","Data":"0c55dc0c067482da01178354ee4aefa7baf05a359c6d442c03755a2406d2ea95"} Feb 19 23:11:20 crc kubenswrapper[4795]: I0219 23:11:20.543296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" event={"ID":"c3cbdd11-d93f-4025-9c08-7530a68f6113","Type":"ContainerStarted","Data":"587260c6871d5ac2af0ecde5796363d9253fc80e5d5b060bd7a191b9752a30d4"} Feb 19 23:11:27 crc kubenswrapper[4795]: I0219 23:11:27.293315 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m4x8w" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" probeResult="failure" output=< Feb 19 23:11:27 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:11:27 crc kubenswrapper[4795]: > Feb 19 23:11:36 crc kubenswrapper[4795]: I0219 23:11:36.293090 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:36 crc kubenswrapper[4795]: I0219 23:11:36.317605 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" podStartSLOduration=17.850097687999998 podStartE2EDuration="18.317574423s" podCreationTimestamp="2026-02-19 23:11:18 +0000 UTC" firstStartedPulling="2026-02-19 23:11:19.159180975 +0000 UTC m=+6190.351698839" 
lastFinishedPulling="2026-02-19 23:11:19.62665771 +0000 UTC m=+6190.819175574" observedRunningTime="2026-02-19 23:11:20.569286294 +0000 UTC m=+6191.761804158" watchObservedRunningTime="2026-02-19 23:11:36.317574423 +0000 UTC m=+6207.510092297" Feb 19 23:11:36 crc kubenswrapper[4795]: I0219 23:11:36.356357 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:36 crc kubenswrapper[4795]: I0219 23:11:36.545827 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:11:37 crc kubenswrapper[4795]: I0219 23:11:37.703365 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m4x8w" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" containerID="cri-o://f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" gracePeriod=2 Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.189085 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.233179 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") pod \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.233301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") pod \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.233364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") pod \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.234672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities" (OuterVolumeSpecName: "utilities") pod "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" (UID: "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.238893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x" (OuterVolumeSpecName: "kube-api-access-frs6x") pod "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" (UID: "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15"). InnerVolumeSpecName "kube-api-access-frs6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.335811 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.335852 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.350374 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" (UID: "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.437493 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712445 4795 generic.go:334] "Generic (PLEG): container finished" podID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerID="f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" exitCode=0 Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerDied","Data":"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4"} Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712523 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712538 4795 scope.go:117] "RemoveContainer" containerID="f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerDied","Data":"69a009f19892e689d5d7710989b3ec9c3b063552425a36b7ffeb2388d0a9aa9a"} Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.737991 4795 scope.go:117] "RemoveContainer" containerID="cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.752594 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.761455 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.774056 4795 scope.go:117] "RemoveContainer" containerID="2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.812442 4795 scope.go:117] "RemoveContainer" containerID="f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" Feb 19 23:11:38 crc kubenswrapper[4795]: E0219 23:11:38.812829 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4\": container with ID starting with f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4 not found: ID does not exist" containerID="f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.812872 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4"} err="failed to get container status \"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4\": rpc error: code = NotFound desc = could not find container \"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4\": container with ID starting with f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4 not found: ID does not exist" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.812899 4795 scope.go:117] "RemoveContainer" containerID="cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671" Feb 19 23:11:38 crc kubenswrapper[4795]: E0219 23:11:38.813150 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671\": container with ID starting with cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671 not found: ID does not exist" containerID="cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.813187 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671"} err="failed to get container status \"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671\": rpc error: code = NotFound desc = could not find container \"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671\": container with ID starting with cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671 not found: ID does not exist" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.813202 4795 scope.go:117] "RemoveContainer" containerID="2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b" Feb 19 23:11:38 crc kubenswrapper[4795]: E0219 
23:11:38.813592 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b\": container with ID starting with 2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b not found: ID does not exist" containerID="2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.813627 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b"} err="failed to get container status \"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b\": rpc error: code = NotFound desc = could not find container \"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b\": container with ID starting with 2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b not found: ID does not exist" Feb 19 23:11:39 crc kubenswrapper[4795]: I0219 23:11:39.523488 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" path="/var/lib/kubelet/pods/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15/volumes" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.936223 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:11:54 crc kubenswrapper[4795]: E0219 23:11:54.937429 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="extract-utilities" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.937457 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="extract-utilities" Feb 19 23:11:54 crc kubenswrapper[4795]: E0219 23:11:54.937485 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.937491 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" Feb 19 23:11:54 crc kubenswrapper[4795]: E0219 23:11:54.937505 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="extract-content" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.937510 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="extract-content" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.937716 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.939313 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.951642 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.985377 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.985509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") pod \"certified-operators-mnftg\" (UID: 
\"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.985575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.086778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.086859 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.086949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.087506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") pod \"certified-operators-mnftg\" (UID: 
\"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.087724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.105496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.261229 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.760470 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:11:55 crc kubenswrapper[4795]: W0219 23:11:55.762839 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10309826_f6d9_49b0_a98c_1c31aab8ca7b.slice/crio-8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125 WatchSource:0}: Error finding container 8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125: Status 404 returned error can't find the container with id 8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125 Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.882626 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" 
event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerStarted","Data":"8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125"} Feb 19 23:11:56 crc kubenswrapper[4795]: I0219 23:11:56.892699 4795 generic.go:334] "Generic (PLEG): container finished" podID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerID="f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20" exitCode=0 Feb 19 23:11:56 crc kubenswrapper[4795]: I0219 23:11:56.892757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerDied","Data":"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20"} Feb 19 23:11:57 crc kubenswrapper[4795]: I0219 23:11:57.902305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerStarted","Data":"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b"} Feb 19 23:11:58 crc kubenswrapper[4795]: I0219 23:11:58.914866 4795 generic.go:334] "Generic (PLEG): container finished" podID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerID="d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b" exitCode=0 Feb 19 23:11:58 crc kubenswrapper[4795]: I0219 23:11:58.915059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerDied","Data":"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b"} Feb 19 23:11:59 crc kubenswrapper[4795]: I0219 23:11:59.935482 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerStarted","Data":"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459"} Feb 19 23:12:05 crc kubenswrapper[4795]: 
I0219 23:12:05.262485 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:05 crc kubenswrapper[4795]: I0219 23:12:05.263154 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:05 crc kubenswrapper[4795]: I0219 23:12:05.341491 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:05 crc kubenswrapper[4795]: I0219 23:12:05.362923 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mnftg" podStartSLOduration=8.974244416 podStartE2EDuration="11.362906535s" podCreationTimestamp="2026-02-19 23:11:54 +0000 UTC" firstStartedPulling="2026-02-19 23:11:56.894576021 +0000 UTC m=+6228.087093885" lastFinishedPulling="2026-02-19 23:11:59.28323814 +0000 UTC m=+6230.475756004" observedRunningTime="2026-02-19 23:11:59.979647904 +0000 UTC m=+6231.172165788" watchObservedRunningTime="2026-02-19 23:12:05.362906535 +0000 UTC m=+6236.555424399" Feb 19 23:12:06 crc kubenswrapper[4795]: I0219 23:12:06.044783 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:06 crc kubenswrapper[4795]: I0219 23:12:06.096710 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.011723 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mnftg" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="registry-server" containerID="cri-o://56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" gracePeriod=2 Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.511273 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.687894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") pod \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.688232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") pod \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.688386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") pod \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.689210 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities" (OuterVolumeSpecName: "utilities") pod "10309826-f6d9-49b0-a98c-1c31aab8ca7b" (UID: "10309826-f6d9-49b0-a98c-1c31aab8ca7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.693648 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw" (OuterVolumeSpecName: "kube-api-access-chpnw") pod "10309826-f6d9-49b0-a98c-1c31aab8ca7b" (UID: "10309826-f6d9-49b0-a98c-1c31aab8ca7b"). 
InnerVolumeSpecName "kube-api-access-chpnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.739281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10309826-f6d9-49b0-a98c-1c31aab8ca7b" (UID: "10309826-f6d9-49b0-a98c-1c31aab8ca7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.789745 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.789785 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.789798 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.021921 4795 generic.go:334] "Generic (PLEG): container finished" podID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerID="56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" exitCode=0 Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.021968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerDied","Data":"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459"} Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.022015 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerDied","Data":"8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125"} Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.022042 4795 scope.go:117] "RemoveContainer" containerID="56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.021972 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.045209 4795 scope.go:117] "RemoveContainer" containerID="d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.069224 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.078237 4795 scope.go:117] "RemoveContainer" containerID="f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.079635 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.136139 4795 scope.go:117] "RemoveContainer" containerID="56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" Feb 19 23:12:09 crc kubenswrapper[4795]: E0219 23:12:09.136623 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459\": container with ID starting with 56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459 not found: ID does not exist" containerID="56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" Feb 19 23:12:09 
crc kubenswrapper[4795]: I0219 23:12:09.136777 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459"} err="failed to get container status \"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459\": rpc error: code = NotFound desc = could not find container \"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459\": container with ID starting with 56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459 not found: ID does not exist" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.136889 4795 scope.go:117] "RemoveContainer" containerID="d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b" Feb 19 23:12:09 crc kubenswrapper[4795]: E0219 23:12:09.137407 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b\": container with ID starting with d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b not found: ID does not exist" containerID="d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.137428 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b"} err="failed to get container status \"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b\": rpc error: code = NotFound desc = could not find container \"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b\": container with ID starting with d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b not found: ID does not exist" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.137440 4795 scope.go:117] "RemoveContainer" containerID="f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20" Feb 19 
23:12:09 crc kubenswrapper[4795]: E0219 23:12:09.137818 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20\": container with ID starting with f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20 not found: ID does not exist" containerID="f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.138055 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20"} err="failed to get container status \"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20\": rpc error: code = NotFound desc = could not find container \"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20\": container with ID starting with f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20 not found: ID does not exist" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.526326 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" path="/var/lib/kubelet/pods/10309826-f6d9-49b0-a98c-1c31aab8ca7b/volumes" Feb 19 23:12:28 crc kubenswrapper[4795]: I0219 23:12:28.427336 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:12:28 crc kubenswrapper[4795]: I0219 23:12:28.427955 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:12:43 crc kubenswrapper[4795]: I0219 23:12:43.041713 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:12:43 crc kubenswrapper[4795]: I0219 23:12:43.056134 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:12:43 crc kubenswrapper[4795]: I0219 23:12:43.526432 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7389820e-b641-4068-b624-af539a234699" path="/var/lib/kubelet/pods/7389820e-b641-4068-b624-af539a234699/volumes" Feb 19 23:12:44 crc kubenswrapper[4795]: I0219 23:12:44.027720 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:12:44 crc kubenswrapper[4795]: I0219 23:12:44.037773 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:12:45 crc kubenswrapper[4795]: I0219 23:12:45.521837 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" path="/var/lib/kubelet/pods/9ca0a783-4d18-4d0a-81d8-7cc1970379a9/volumes" Feb 19 23:12:49 crc kubenswrapper[4795]: I0219 23:12:49.029800 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:12:49 crc kubenswrapper[4795]: I0219 23:12:49.038689 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:12:49 crc kubenswrapper[4795]: I0219 23:12:49.524344 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" path="/var/lib/kubelet/pods/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa/volumes" Feb 19 23:12:50 crc kubenswrapper[4795]: I0219 23:12:50.029397 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 
23:12:50 crc kubenswrapper[4795]: I0219 23:12:50.039992 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 23:12:51 crc kubenswrapper[4795]: I0219 23:12:51.527560 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" path="/var/lib/kubelet/pods/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2/volumes" Feb 19 23:12:58 crc kubenswrapper[4795]: I0219 23:12:58.427636 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:12:58 crc kubenswrapper[4795]: I0219 23:12:58.429305 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:13:16 crc kubenswrapper[4795]: I0219 23:13:16.037296 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:13:16 crc kubenswrapper[4795]: I0219 23:13:16.046388 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:13:17 crc kubenswrapper[4795]: I0219 23:13:17.529296 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" path="/var/lib/kubelet/pods/c7895d70-3c78-4913-9028-75797e6e1dbd/volumes" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.633521 4795 scope.go:117] "RemoveContainer" containerID="700b20fbdb16cdb721d65035d59d19827c546e867ba32ed9f351235ca4bc0246" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.670663 4795 scope.go:117] 
"RemoveContainer" containerID="922d8fbba822aa06530b36452ecdfe6cce93b9221523be2934bb7556f81619e3" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.716987 4795 scope.go:117] "RemoveContainer" containerID="8783e1276005ef506acf0881371a237a591f18c217cbbd901f218637b5c95d2c" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.766859 4795 scope.go:117] "RemoveContainer" containerID="d4c8e37b4453cd9ee00d52e7178d48381bffc05905e787ed7678728d6f9cc0ef" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.844532 4795 scope.go:117] "RemoveContainer" containerID="461ac9425c34a3821048eb55409a0a70365c0acacbfa307ea4409068b90afe68" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.875469 4795 scope.go:117] "RemoveContainer" containerID="fe6c050cae9125e8669040d524bc8951c45b9effbc5921d0ee07f69c88d514a2" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.427432 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.427797 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.427839 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.428640 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.428688 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" gracePeriod=600 Feb 19 23:13:28 crc kubenswrapper[4795]: E0219 23:13:28.551297 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.785700 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" exitCode=0 Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.785745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4"} Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.785781 4795 scope.go:117] "RemoveContainer" containerID="b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.786570 4795 
scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:13:28 crc kubenswrapper[4795]: E0219 23:13:28.787059 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:13:44 crc kubenswrapper[4795]: I0219 23:13:44.511317 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:13:44 crc kubenswrapper[4795]: E0219 23:13:44.512101 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:13:58 crc kubenswrapper[4795]: I0219 23:13:58.511966 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:13:58 crc kubenswrapper[4795]: E0219 23:13:58.514109 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:14:10 crc kubenswrapper[4795]: I0219 
23:14:10.511721 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:14:10 crc kubenswrapper[4795]: E0219 23:14:10.512506 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:14:22 crc kubenswrapper[4795]: I0219 23:14:22.513017 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:14:22 crc kubenswrapper[4795]: E0219 23:14:22.513807 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.047303 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:29 crc kubenswrapper[4795]: E0219 23:14:29.049323 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="extract-content" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.049456 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="extract-content" Feb 19 23:14:29 crc kubenswrapper[4795]: E0219 23:14:29.049547 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="registry-server" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.049638 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="registry-server" Feb 19 23:14:29 crc kubenswrapper[4795]: E0219 23:14:29.049765 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="extract-utilities" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.049842 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="extract-utilities" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.050188 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="registry-server" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.060643 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.086654 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.168375 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.168716 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") 
" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.168879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.272536 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.272581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.272647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.273045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " 
pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.273121 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.290070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.388184 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.862093 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:30 crc kubenswrapper[4795]: I0219 23:14:30.382663 4795 generic.go:334] "Generic (PLEG): container finished" podID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerID="62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535" exitCode=0 Feb 19 23:14:30 crc kubenswrapper[4795]: I0219 23:14:30.382731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerDied","Data":"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535"} Feb 19 23:14:30 crc kubenswrapper[4795]: I0219 23:14:30.382979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" 
event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerStarted","Data":"50d08332a8251ec3f48c5be46a354c07cc40d484277909a9ca772e3ee10ef0b2"} Feb 19 23:14:30 crc kubenswrapper[4795]: I0219 23:14:30.384869 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:14:31 crc kubenswrapper[4795]: I0219 23:14:31.393769 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerStarted","Data":"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c"} Feb 19 23:14:32 crc kubenswrapper[4795]: I0219 23:14:32.406363 4795 generic.go:334] "Generic (PLEG): container finished" podID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerID="4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c" exitCode=0 Feb 19 23:14:32 crc kubenswrapper[4795]: I0219 23:14:32.406442 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerDied","Data":"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c"} Feb 19 23:14:33 crc kubenswrapper[4795]: I0219 23:14:33.417897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerStarted","Data":"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241"} Feb 19 23:14:33 crc kubenswrapper[4795]: I0219 23:14:33.443854 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jvh4m" podStartSLOduration=2.005029077 podStartE2EDuration="4.443836364s" podCreationTimestamp="2026-02-19 23:14:29 +0000 UTC" firstStartedPulling="2026-02-19 23:14:30.384677983 +0000 UTC m=+6381.577195847" lastFinishedPulling="2026-02-19 23:14:32.82348527 +0000 UTC 
m=+6384.016003134" observedRunningTime="2026-02-19 23:14:33.43551894 +0000 UTC m=+6384.628036824" watchObservedRunningTime="2026-02-19 23:14:33.443836364 +0000 UTC m=+6384.636354218" Feb 19 23:14:35 crc kubenswrapper[4795]: I0219 23:14:35.512264 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:14:35 crc kubenswrapper[4795]: E0219 23:14:35.512947 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.389745 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.390406 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.463149 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.555385 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.706509 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:41 crc kubenswrapper[4795]: I0219 23:14:41.493948 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jvh4m" 
podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="registry-server" containerID="cri-o://b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" gracePeriod=2 Feb 19 23:14:41 crc kubenswrapper[4795]: I0219 23:14:41.995623 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.160975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") pod \"453f74c8-d1b5-4e9f-b405-0341eead8a87\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.161050 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") pod \"453f74c8-d1b5-4e9f-b405-0341eead8a87\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.161222 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") pod \"453f74c8-d1b5-4e9f-b405-0341eead8a87\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.162281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities" (OuterVolumeSpecName: "utilities") pod "453f74c8-d1b5-4e9f-b405-0341eead8a87" (UID: "453f74c8-d1b5-4e9f-b405-0341eead8a87"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.168040 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9" (OuterVolumeSpecName: "kube-api-access-dsgf9") pod "453f74c8-d1b5-4e9f-b405-0341eead8a87" (UID: "453f74c8-d1b5-4e9f-b405-0341eead8a87"). InnerVolumeSpecName "kube-api-access-dsgf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.183501 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "453f74c8-d1b5-4e9f-b405-0341eead8a87" (UID: "453f74c8-d1b5-4e9f-b405-0341eead8a87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.264717 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.264760 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.264776 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508273 4795 generic.go:334] "Generic (PLEG): container finished" podID="453f74c8-d1b5-4e9f-b405-0341eead8a87" 
containerID="b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" exitCode=0 Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerDied","Data":"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241"} Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508350 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508371 4795 scope.go:117] "RemoveContainer" containerID="b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508357 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerDied","Data":"50d08332a8251ec3f48c5be46a354c07cc40d484277909a9ca772e3ee10ef0b2"} Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.554712 4795 scope.go:117] "RemoveContainer" containerID="4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.557476 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.571896 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.580457 4795 scope.go:117] "RemoveContainer" containerID="62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.639909 4795 scope.go:117] "RemoveContainer" containerID="b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" Feb 19 
23:14:42 crc kubenswrapper[4795]: E0219 23:14:42.640415 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241\": container with ID starting with b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241 not found: ID does not exist" containerID="b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.640454 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241"} err="failed to get container status \"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241\": rpc error: code = NotFound desc = could not find container \"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241\": container with ID starting with b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241 not found: ID does not exist" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.640498 4795 scope.go:117] "RemoveContainer" containerID="4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c" Feb 19 23:14:42 crc kubenswrapper[4795]: E0219 23:14:42.640766 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c\": container with ID starting with 4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c not found: ID does not exist" containerID="4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.640808 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c"} err="failed to get container status 
\"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c\": rpc error: code = NotFound desc = could not find container \"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c\": container with ID starting with 4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c not found: ID does not exist" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.640840 4795 scope.go:117] "RemoveContainer" containerID="62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535" Feb 19 23:14:42 crc kubenswrapper[4795]: E0219 23:14:42.641271 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535\": container with ID starting with 62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535 not found: ID does not exist" containerID="62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.641318 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535"} err="failed to get container status \"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535\": rpc error: code = NotFound desc = could not find container \"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535\": container with ID starting with 62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535 not found: ID does not exist" Feb 19 23:14:43 crc kubenswrapper[4795]: I0219 23:14:43.526767 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" path="/var/lib/kubelet/pods/453f74c8-d1b5-4e9f-b405-0341eead8a87/volumes" Feb 19 23:14:49 crc kubenswrapper[4795]: I0219 23:14:49.518202 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 
23:14:49 crc kubenswrapper[4795]: E0219 23:14:49.518931 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.163623 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 19 23:15:00 crc kubenswrapper[4795]: E0219 23:15:00.164817 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="registry-server" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.164837 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="registry-server" Feb 19 23:15:00 crc kubenswrapper[4795]: E0219 23:15:00.164900 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="extract-content" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.164910 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="extract-content" Feb 19 23:15:00 crc kubenswrapper[4795]: E0219 23:15:00.164931 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="extract-utilities" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.164941 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="extract-utilities" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.165305 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="registry-server" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.166413 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.175573 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.175842 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.179994 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.198284 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.198325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.198364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsdb9\" (UniqueName: 
\"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.299614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.299676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.299726 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsdb9\" (UniqueName: \"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.300665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 
23:15:00.308189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.315466 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsdb9\" (UniqueName: \"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.498820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.989397 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 19 23:15:00 crc kubenswrapper[4795]: W0219 23:15:00.989643 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eafa182_621e_48fe_a019_360c2f94c212.slice/crio-6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010 WatchSource:0}: Error finding container 6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010: Status 404 returned error can't find the container with id 6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010 Feb 19 23:15:01 crc kubenswrapper[4795]: I0219 23:15:01.512426 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:01 crc kubenswrapper[4795]: E0219 23:15:01.513221 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:01 crc kubenswrapper[4795]: I0219 23:15:01.724999 4795 generic.go:334] "Generic (PLEG): container finished" podID="3eafa182-621e-48fe-a019-360c2f94c212" containerID="506e26301ea1d8c97cb2f36560ab4d1582f8a10338a4a109885babb88591cfe0" exitCode=0 Feb 19 23:15:01 crc kubenswrapper[4795]: I0219 23:15:01.725047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" event={"ID":"3eafa182-621e-48fe-a019-360c2f94c212","Type":"ContainerDied","Data":"506e26301ea1d8c97cb2f36560ab4d1582f8a10338a4a109885babb88591cfe0"} Feb 19 23:15:01 crc kubenswrapper[4795]: I0219 23:15:01.725105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" event={"ID":"3eafa182-621e-48fe-a019-360c2f94c212","Type":"ContainerStarted","Data":"6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010"} Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.088765 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.259796 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") pod \"3eafa182-621e-48fe-a019-360c2f94c212\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.260217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") pod \"3eafa182-621e-48fe-a019-360c2f94c212\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.260353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsdb9\" (UniqueName: \"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") pod \"3eafa182-621e-48fe-a019-360c2f94c212\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.261009 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume" (OuterVolumeSpecName: "config-volume") pod "3eafa182-621e-48fe-a019-360c2f94c212" (UID: "3eafa182-621e-48fe-a019-360c2f94c212"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.261917 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.265818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9" (OuterVolumeSpecName: "kube-api-access-xsdb9") pod "3eafa182-621e-48fe-a019-360c2f94c212" (UID: "3eafa182-621e-48fe-a019-360c2f94c212"). InnerVolumeSpecName "kube-api-access-xsdb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.265929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3eafa182-621e-48fe-a019-360c2f94c212" (UID: "3eafa182-621e-48fe-a019-360c2f94c212"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.364048 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.364535 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsdb9\" (UniqueName: \"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.745637 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" event={"ID":"3eafa182-621e-48fe-a019-360c2f94c212","Type":"ContainerDied","Data":"6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010"} Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.745685 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.746045 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:04 crc kubenswrapper[4795]: I0219 23:15:04.165147 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 23:15:04 crc kubenswrapper[4795]: I0219 23:15:04.173358 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 23:15:05 crc kubenswrapper[4795]: I0219 23:15:05.527935 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" path="/var/lib/kubelet/pods/e8d7fc5a-2c38-45d1-92d4-e30329082e49/volumes" Feb 19 23:15:15 crc kubenswrapper[4795]: I0219 23:15:15.516370 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:15 crc kubenswrapper[4795]: E0219 23:15:15.517622 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:26 crc kubenswrapper[4795]: I0219 23:15:26.513579 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:26 crc kubenswrapper[4795]: E0219 23:15:26.515019 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:27 crc kubenswrapper[4795]: I0219 23:15:27.046975 4795 scope.go:117] "RemoveContainer" containerID="2891e05af0080148a30c661d705a64987123339160913040e4d09a5170f489c1" Feb 19 23:15:39 crc kubenswrapper[4795]: I0219 23:15:39.525268 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:39 crc kubenswrapper[4795]: E0219 23:15:39.526024 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:54 crc kubenswrapper[4795]: I0219 23:15:54.511503 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:54 crc kubenswrapper[4795]: E0219 23:15:54.512399 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:16:07 crc kubenswrapper[4795]: I0219 23:16:07.512112 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:16:07 crc kubenswrapper[4795]: E0219 23:16:07.512935 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.062353 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.071993 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.080654 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.088692 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.526821 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" path="/var/lib/kubelet/pods/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636/volumes" Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.527683 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" path="/var/lib/kubelet/pods/c7049350-2c57-49c2-aef7-b9f0bd28abfc/volumes" Feb 19 23:16:21 crc kubenswrapper[4795]: I0219 23:16:21.511903 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:16:21 crc kubenswrapper[4795]: E0219 23:16:21.513473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:16:22 crc kubenswrapper[4795]: I0219 23:16:22.041795 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-wlhqm"] Feb 19 23:16:22 crc kubenswrapper[4795]: I0219 23:16:22.051927 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-wlhqm"] Feb 19 23:16:23 crc kubenswrapper[4795]: I0219 23:16:23.525939 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" path="/var/lib/kubelet/pods/497c4c82-13ae-430c-83bd-1f1c4d4683e4/volumes" Feb 19 23:16:27 crc kubenswrapper[4795]: I0219 23:16:27.149411 4795 scope.go:117] "RemoveContainer" containerID="26cce21cbd7189a101a6a725aa7d2a769b17bb5f8957270015deb5068ba381a3" Feb 19 23:16:27 crc kubenswrapper[4795]: I0219 23:16:27.179782 4795 scope.go:117] "RemoveContainer" containerID="be3b7d10ee8dbba79631201a8d5da4057d99e4383e860152ba77db4992ff52fc" Feb 19 23:16:27 crc kubenswrapper[4795]: I0219 23:16:27.233796 4795 scope.go:117] "RemoveContainer" containerID="394778543b7d586f5d57eb0c386c1afcba774c0cbc8858b3c036d9d786189525" Feb 19 23:16:34 crc kubenswrapper[4795]: I0219 23:16:34.511286 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:16:34 crc kubenswrapper[4795]: E0219 23:16:34.512035 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 
23:16:47 crc kubenswrapper[4795]: I0219 23:16:47.512814 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:16:47 crc kubenswrapper[4795]: E0219 23:16:47.513933 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:02 crc kubenswrapper[4795]: I0219 23:17:02.512061 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:02 crc kubenswrapper[4795]: E0219 23:17:02.513011 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:14 crc kubenswrapper[4795]: I0219 23:17:14.512615 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:14 crc kubenswrapper[4795]: E0219 23:17:14.513732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:25 crc kubenswrapper[4795]: I0219 23:17:25.512055 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:25 crc kubenswrapper[4795]: E0219 23:17:25.513834 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:36 crc kubenswrapper[4795]: I0219 23:17:36.511782 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:36 crc kubenswrapper[4795]: E0219 23:17:36.512536 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:51 crc kubenswrapper[4795]: I0219 23:17:51.518557 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:51 crc kubenswrapper[4795]: E0219 23:17:51.519880 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:18:06 crc kubenswrapper[4795]: I0219 23:18:06.512100 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:18:06 crc kubenswrapper[4795]: E0219 23:18:06.513148 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:18:17 crc kubenswrapper[4795]: I0219 23:18:17.513307 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:18:17 crc kubenswrapper[4795]: E0219 23:18:17.514049 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.038982 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-jll2l"] Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.056930 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"] Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.066538 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-jll2l"] Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 
23:18:25.074968 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"] Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.522448 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3086733-54e4-4041-9896-88f6df519492" path="/var/lib/kubelet/pods/d3086733-54e4-4041-9896-88f6df519492/volumes" Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.523071 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" path="/var/lib/kubelet/pods/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8/volumes" Feb 19 23:18:27 crc kubenswrapper[4795]: I0219 23:18:27.351301 4795 scope.go:117] "RemoveContainer" containerID="49e34c186890a85782e2cf1f05ffa268eb8918522484c1ad1e090793c43863fa" Feb 19 23:18:27 crc kubenswrapper[4795]: I0219 23:18:27.378955 4795 scope.go:117] "RemoveContainer" containerID="6653ee424e57053f6b0308afd986889a06d2949741f5136682c4b14068b5b724" Feb 19 23:18:30 crc kubenswrapper[4795]: I0219 23:18:30.512093 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:18:30 crc kubenswrapper[4795]: I0219 23:18:30.873240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5"} Feb 19 23:18:35 crc kubenswrapper[4795]: I0219 23:18:35.027704 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-q2lkk"] Feb 19 23:18:35 crc kubenswrapper[4795]: I0219 23:18:35.035448 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-q2lkk"] Feb 19 23:18:35 crc kubenswrapper[4795]: I0219 23:18:35.522522 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1953fbb-b558-497f-b889-62b41f35e4b4" 
path="/var/lib/kubelet/pods/e1953fbb-b558-497f-b889-62b41f35e4b4/volumes" Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.041203 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"] Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.051494 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-657cv"] Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.060679 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-657cv"] Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.068103 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"] Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.524558 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" path="/var/lib/kubelet/pods/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce/volumes" Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.526610 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eadcaa3c-623a-409a-b735-2a38854c8036" path="/var/lib/kubelet/pods/eadcaa3c-623a-409a-b735-2a38854c8036/volumes" Feb 19 23:19:05 crc kubenswrapper[4795]: I0219 23:19:05.055622 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-8fzxs"] Feb 19 23:19:05 crc kubenswrapper[4795]: I0219 23:19:05.070610 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-8fzxs"] Feb 19 23:19:05 crc kubenswrapper[4795]: I0219 23:19:05.546744 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd19113b-623e-4f3e-8392-09968a5d71f9" path="/var/lib/kubelet/pods/bd19113b-623e-4f3e-8392-09968a5d71f9/volumes" Feb 19 23:19:27 crc kubenswrapper[4795]: I0219 23:19:27.478070 4795 scope.go:117] "RemoveContainer" containerID="2537d15feb1a2c6f922f8d39bb6ed8ef442cb362fc714c8399a87d6d69b4784d" Feb 19 23:19:27 
crc kubenswrapper[4795]: I0219 23:19:27.512095 4795 scope.go:117] "RemoveContainer" containerID="47bda409c5d13cad14cf54811039c8544c7b3358bb1de45385c9a60e61a75704" Feb 19 23:19:27 crc kubenswrapper[4795]: I0219 23:19:27.581816 4795 scope.go:117] "RemoveContainer" containerID="74c8a86b466b15b56eddcfa8140418aa598bd95ce7f01643ab2976a1bd25cfe5" Feb 19 23:19:27 crc kubenswrapper[4795]: I0219 23:19:27.624383 4795 scope.go:117] "RemoveContainer" containerID="a15bc43318a5f56769ebc9469003d0ddb425e341f35a3faca3d202f2707e3b2d" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.020972 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:39 crc kubenswrapper[4795]: E0219 23:20:39.021909 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eafa182-621e-48fe-a019-360c2f94c212" containerName="collect-profiles" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.021924 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eafa182-621e-48fe-a019-360c2f94c212" containerName="collect-profiles" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.022140 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eafa182-621e-48fe-a019-360c2f94c212" containerName="collect-profiles" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.023851 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.045127 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.079073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.079367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.079503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.182316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.182392 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.182429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.182921 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.183032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.218888 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.354946 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.922956 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:40 crc kubenswrapper[4795]: I0219 23:20:40.206291 4795 generic.go:334] "Generic (PLEG): container finished" podID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerID="90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930" exitCode=0 Feb 19 23:20:40 crc kubenswrapper[4795]: I0219 23:20:40.206344 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerDied","Data":"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930"} Feb 19 23:20:40 crc kubenswrapper[4795]: I0219 23:20:40.206591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerStarted","Data":"123360a20adc433e46230a3d86ab168ba8be10b64f023d54e7c7d020d514ec34"} Feb 19 23:20:40 crc kubenswrapper[4795]: I0219 23:20:40.209013 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:20:41 crc kubenswrapper[4795]: I0219 23:20:41.218834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerStarted","Data":"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6"} Feb 19 23:20:43 crc kubenswrapper[4795]: I0219 23:20:43.237719 4795 generic.go:334] "Generic (PLEG): container finished" podID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerID="9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6" exitCode=0 Feb 19 23:20:43 crc kubenswrapper[4795]: I0219 23:20:43.237764 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerDied","Data":"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6"} Feb 19 23:20:44 crc kubenswrapper[4795]: I0219 23:20:44.249910 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerStarted","Data":"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e"} Feb 19 23:20:44 crc kubenswrapper[4795]: I0219 23:20:44.269230 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qvzdn" podStartSLOduration=2.820813672 podStartE2EDuration="6.269207535s" podCreationTimestamp="2026-02-19 23:20:38 +0000 UTC" firstStartedPulling="2026-02-19 23:20:40.208824513 +0000 UTC m=+6751.401342377" lastFinishedPulling="2026-02-19 23:20:43.657218386 +0000 UTC m=+6754.849736240" observedRunningTime="2026-02-19 23:20:44.265638635 +0000 UTC m=+6755.458156499" watchObservedRunningTime="2026-02-19 23:20:44.269207535 +0000 UTC m=+6755.461725399" Feb 19 23:20:49 crc kubenswrapper[4795]: I0219 23:20:49.355271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:49 crc kubenswrapper[4795]: I0219 23:20:49.355935 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:49 crc kubenswrapper[4795]: I0219 23:20:49.423312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:50 crc kubenswrapper[4795]: I0219 23:20:50.364183 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:50 crc kubenswrapper[4795]: I0219 
23:20:50.422886 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.329067 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qvzdn" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="registry-server" containerID="cri-o://94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" gracePeriod=2 Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.857070 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.891718 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") pod \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.891855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") pod \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.891900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") pod \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.892481 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities" (OuterVolumeSpecName: 
"utilities") pod "27d6fc33-0869-4db3-8e1d-ce352d33d9cb" (UID: "27d6fc33-0869-4db3-8e1d-ce352d33d9cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.892860 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.898385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq" (OuterVolumeSpecName: "kube-api-access-g9hvq") pod "27d6fc33-0869-4db3-8e1d-ce352d33d9cb" (UID: "27d6fc33-0869-4db3-8e1d-ce352d33d9cb"). InnerVolumeSpecName "kube-api-access-g9hvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.946399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27d6fc33-0869-4db3-8e1d-ce352d33d9cb" (UID: "27d6fc33-0869-4db3-8e1d-ce352d33d9cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.994249 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.994278 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") on node \"crc\" DevicePath \"\"" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.340798 4795 generic.go:334] "Generic (PLEG): container finished" podID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerID="94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" exitCode=0 Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.340872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerDied","Data":"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e"} Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.342145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerDied","Data":"123360a20adc433e46230a3d86ab168ba8be10b64f023d54e7c7d020d514ec34"} Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.340914 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.342236 4795 scope.go:117] "RemoveContainer" containerID="94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.365827 4795 scope.go:117] "RemoveContainer" containerID="9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.387557 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.400716 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.415488 4795 scope.go:117] "RemoveContainer" containerID="90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.451511 4795 scope.go:117] "RemoveContainer" containerID="94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" Feb 19 23:20:53 crc kubenswrapper[4795]: E0219 23:20:53.451989 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e\": container with ID starting with 94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e not found: ID does not exist" containerID="94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452018 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e"} err="failed to get container status \"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e\": rpc error: code = NotFound desc = could not find 
container \"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e\": container with ID starting with 94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e not found: ID does not exist" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452037 4795 scope.go:117] "RemoveContainer" containerID="9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6" Feb 19 23:20:53 crc kubenswrapper[4795]: E0219 23:20:53.452557 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6\": container with ID starting with 9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6 not found: ID does not exist" containerID="9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452597 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6"} err="failed to get container status \"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6\": rpc error: code = NotFound desc = could not find container \"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6\": container with ID starting with 9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6 not found: ID does not exist" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452620 4795 scope.go:117] "RemoveContainer" containerID="90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930" Feb 19 23:20:53 crc kubenswrapper[4795]: E0219 23:20:53.452881 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930\": container with ID starting with 90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930 not found: ID does 
not exist" containerID="90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452909 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930"} err="failed to get container status \"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930\": rpc error: code = NotFound desc = could not find container \"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930\": container with ID starting with 90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930 not found: ID does not exist" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.553267 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" path="/var/lib/kubelet/pods/27d6fc33-0869-4db3-8e1d-ce352d33d9cb/volumes" Feb 19 23:20:58 crc kubenswrapper[4795]: I0219 23:20:58.427526 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:20:58 crc kubenswrapper[4795]: I0219 23:20:58.428056 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:21:28 crc kubenswrapper[4795]: I0219 23:21:28.427738 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 19 23:21:28 crc kubenswrapper[4795]: I0219 23:21:28.428519 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.427466 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.428085 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.428134 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.429083 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.429150 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5" gracePeriod=600 Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.979141 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5" exitCode=0 Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.979305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5"} Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.979550 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"} Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.979578 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:22:04 crc kubenswrapper[4795]: I0219 23:22:04.048623 4795 generic.go:334] "Generic (PLEG): container finished" podID="c3cbdd11-d93f-4025-9c08-7530a68f6113" containerID="587260c6871d5ac2af0ecde5796363d9253fc80e5d5b060bd7a191b9752a30d4" exitCode=0 Feb 19 23:22:04 crc kubenswrapper[4795]: I0219 23:22:04.048713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" event={"ID":"c3cbdd11-d93f-4025-9c08-7530a68f6113","Type":"ContainerDied","Data":"587260c6871d5ac2af0ecde5796363d9253fc80e5d5b060bd7a191b9752a30d4"} Feb 19 23:22:05 crc kubenswrapper[4795]: 
I0219 23:22:05.575353 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.708878 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.708933 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.708963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.709239 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.709296 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc 
kubenswrapper[4795]: I0219 23:22:05.714644 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.714758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph" (OuterVolumeSpecName: "ceph") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.715218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf" (OuterVolumeSpecName: "kube-api-access-hh9wf") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "kube-api-access-hh9wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.736553 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory" (OuterVolumeSpecName: "inventory") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.738290 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812148 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812203 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812214 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812223 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812234 4795 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.067013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" event={"ID":"c3cbdd11-d93f-4025-9c08-7530a68f6113","Type":"ContainerDied","Data":"0c55dc0c067482da01178354ee4aefa7baf05a359c6d442c03755a2406d2ea95"} Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.067056 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c55dc0c067482da01178354ee4aefa7baf05a359c6d442c03755a2406d2ea95" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.067088 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.922841 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:06 crc kubenswrapper[4795]: E0219 23:22:06.923627 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="extract-utilities" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923641 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="extract-utilities" Feb 19 23:22:06 crc kubenswrapper[4795]: E0219 23:22:06.923673 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3cbdd11-d93f-4025-9c08-7530a68f6113" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923681 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cbdd11-d93f-4025-9c08-7530a68f6113" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 19 23:22:06 crc kubenswrapper[4795]: E0219 23:22:06.923698 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="registry-server" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923704 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="registry-server" Feb 19 23:22:06 crc kubenswrapper[4795]: E0219 23:22:06.923731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="extract-content" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923736 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="extract-content" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923933 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3cbdd11-d93f-4025-9c08-7530a68f6113" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923953 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="registry-server" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.925492 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.938021 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.033823 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.033899 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.033926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.136236 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.136300 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.136325 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.137291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.137297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.156885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.263220 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.841515 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:08 crc kubenswrapper[4795]: I0219 23:22:08.086485 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerID="7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef" exitCode=0 Feb 19 23:22:08 crc kubenswrapper[4795]: I0219 23:22:08.086535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerDied","Data":"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef"} Feb 19 23:22:08 crc kubenswrapper[4795]: I0219 23:22:08.086782 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerStarted","Data":"7f04f25cf122644f89f7e4477044089ac044f21f8955c90051f04532f6b7ae72"} Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.320236 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.324291 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.370978 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.493693 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.493810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.493959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.595602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.595678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.595740 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.596319 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.596541 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.630312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.650642 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.995811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:10 crc kubenswrapper[4795]: W0219 23:22:10.011749 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33946ece_f847_4ce2_ab50_c4f2f61f9b4e.slice/crio-3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b WatchSource:0}: Error finding container 3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b: Status 404 returned error can't find the container with id 3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.106916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerStarted","Data":"3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b"} Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.110797 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerID="3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1" exitCode=0 Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.110837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerDied","Data":"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1"} Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.175557 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-rmhfb"] Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.177596 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.180096 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.180396 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.182366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.187260 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-rmhfb"] Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.190147 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.316827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.317342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.317418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.317528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.317774 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420464 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 
23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420503 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420690 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.427601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.427936 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: 
\"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.428001 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.428042 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.443000 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.503715 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.135715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerStarted","Data":"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071"} Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.141748 4795 generic.go:334] "Generic (PLEG): container finished" podID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerID="487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28" exitCode=0 Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.141811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerDied","Data":"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28"} Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.164463 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-rmhfb"] Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.168219 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8dlgt" podStartSLOduration=2.716172142 podStartE2EDuration="5.16820106s" podCreationTimestamp="2026-02-19 23:22:06 +0000 UTC" firstStartedPulling="2026-02-19 23:22:08.088500301 +0000 UTC m=+6839.281018165" lastFinishedPulling="2026-02-19 23:22:10.540529219 +0000 UTC m=+6841.733047083" observedRunningTime="2026-02-19 23:22:11.15645554 +0000 UTC m=+6842.348973394" watchObservedRunningTime="2026-02-19 23:22:11.16820106 +0000 UTC m=+6842.360718924" Feb 19 23:22:12 crc kubenswrapper[4795]: I0219 23:22:12.152051 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" 
event={"ID":"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa","Type":"ContainerStarted","Data":"94c7431ebacbd1806e6fe318625607d7ac4c8655988210ced64888f48d2153e5"} Feb 19 23:22:12 crc kubenswrapper[4795]: I0219 23:22:12.152409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" event={"ID":"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa","Type":"ContainerStarted","Data":"8334d40c98b3c6b685c906be16789e5a35ea56989f4f2c9374c39b35dba237a9"} Feb 19 23:22:12 crc kubenswrapper[4795]: I0219 23:22:12.156330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerStarted","Data":"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4"} Feb 19 23:22:12 crc kubenswrapper[4795]: I0219 23:22:12.174125 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" podStartSLOduration=1.647355608 podStartE2EDuration="2.174106467s" podCreationTimestamp="2026-02-19 23:22:10 +0000 UTC" firstStartedPulling="2026-02-19 23:22:11.17630256 +0000 UTC m=+6842.368820414" lastFinishedPulling="2026-02-19 23:22:11.703053409 +0000 UTC m=+6842.895571273" observedRunningTime="2026-02-19 23:22:12.166194252 +0000 UTC m=+6843.358712116" watchObservedRunningTime="2026-02-19 23:22:12.174106467 +0000 UTC m=+6843.366624331" Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.205597 4795 generic.go:334] "Generic (PLEG): container finished" podID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerID="b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4" exitCode=0 Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.207325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" 
event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerDied","Data":"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4"} Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.263371 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.263425 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.313125 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:18 crc kubenswrapper[4795]: I0219 23:22:18.218385 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerStarted","Data":"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc"} Feb 19 23:22:18 crc kubenswrapper[4795]: I0219 23:22:18.250151 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xhm6c" podStartSLOduration=2.67220479 podStartE2EDuration="9.250127187s" podCreationTimestamp="2026-02-19 23:22:09 +0000 UTC" firstStartedPulling="2026-02-19 23:22:11.143891787 +0000 UTC m=+6842.336409651" lastFinishedPulling="2026-02-19 23:22:17.721814184 +0000 UTC m=+6848.914332048" observedRunningTime="2026-02-19 23:22:18.239048083 +0000 UTC m=+6849.431565957" watchObservedRunningTime="2026-02-19 23:22:18.250127187 +0000 UTC m=+6849.442645051" Feb 19 23:22:18 crc kubenswrapper[4795]: I0219 23:22:18.275475 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:19 crc kubenswrapper[4795]: I0219 23:22:19.650852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:19 crc kubenswrapper[4795]: I0219 23:22:19.651160 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.312217 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.312796 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8dlgt" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="registry-server" containerID="cri-o://cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" gracePeriod=2 Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.704702 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xhm6c" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" probeResult="failure" output=< Feb 19 23:22:20 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:22:20 crc kubenswrapper[4795]: > Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.837155 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.931235 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") pod \"c8467b6c-6941-4df2-b652-61b4c8eee22e\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.931384 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") pod \"c8467b6c-6941-4df2-b652-61b4c8eee22e\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.931526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") pod \"c8467b6c-6941-4df2-b652-61b4c8eee22e\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.932758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities" (OuterVolumeSpecName: "utilities") pod "c8467b6c-6941-4df2-b652-61b4c8eee22e" (UID: "c8467b6c-6941-4df2-b652-61b4c8eee22e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.937128 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5" (OuterVolumeSpecName: "kube-api-access-457r5") pod "c8467b6c-6941-4df2-b652-61b4c8eee22e" (UID: "c8467b6c-6941-4df2-b652-61b4c8eee22e"). InnerVolumeSpecName "kube-api-access-457r5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.976915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8467b6c-6941-4df2-b652-61b4c8eee22e" (UID: "c8467b6c-6941-4df2-b652-61b4c8eee22e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.034105 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.034139 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.034149 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272236 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerID="cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" exitCode=0 Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerDied","Data":"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071"} Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272304 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerDied","Data":"7f04f25cf122644f89f7e4477044089ac044f21f8955c90051f04532f6b7ae72"} Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272306 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272322 4795 scope.go:117] "RemoveContainer" containerID="cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.290899 4795 scope.go:117] "RemoveContainer" containerID="3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.310527 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.313104 4795 scope.go:117] "RemoveContainer" containerID="7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.324364 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.364575 4795 scope.go:117] "RemoveContainer" containerID="cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" Feb 19 23:22:21 crc kubenswrapper[4795]: E0219 23:22:21.365106 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071\": container with ID starting with cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071 not found: ID does not exist" containerID="cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 
23:22:21.365159 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071"} err="failed to get container status \"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071\": rpc error: code = NotFound desc = could not find container \"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071\": container with ID starting with cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071 not found: ID does not exist" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.365208 4795 scope.go:117] "RemoveContainer" containerID="3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1" Feb 19 23:22:21 crc kubenswrapper[4795]: E0219 23:22:21.365589 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1\": container with ID starting with 3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1 not found: ID does not exist" containerID="3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.365639 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1"} err="failed to get container status \"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1\": rpc error: code = NotFound desc = could not find container \"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1\": container with ID starting with 3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1 not found: ID does not exist" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.365671 4795 scope.go:117] "RemoveContainer" containerID="7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef" Feb 19 23:22:21 crc 
kubenswrapper[4795]: E0219 23:22:21.365955 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef\": container with ID starting with 7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef not found: ID does not exist" containerID="7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.365993 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef"} err="failed to get container status \"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef\": rpc error: code = NotFound desc = could not find container \"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef\": container with ID starting with 7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef not found: ID does not exist" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.527095 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" path="/var/lib/kubelet/pods/c8467b6c-6941-4df2-b652-61b4c8eee22e/volumes" Feb 19 23:22:29 crc kubenswrapper[4795]: I0219 23:22:29.710804 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:29 crc kubenswrapper[4795]: I0219 23:22:29.759409 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:30 crc kubenswrapper[4795]: I0219 23:22:30.830814 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.391503 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-xhm6c" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" containerID="cri-o://a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" gracePeriod=2 Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.823822 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.900349 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") pod \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.900725 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") pod \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.901041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") pod \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.902210 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities" (OuterVolumeSpecName: "utilities") pod "33946ece-f847-4ce2-ab50-c4f2f61f9b4e" (UID: "33946ece-f847-4ce2-ab50-c4f2f61f9b4e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.906390 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd" (OuterVolumeSpecName: "kube-api-access-42bqd") pod "33946ece-f847-4ce2-ab50-c4f2f61f9b4e" (UID: "33946ece-f847-4ce2-ab50-c4f2f61f9b4e"). InnerVolumeSpecName "kube-api-access-42bqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.003929 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.003999 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.048045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33946ece-f847-4ce2-ab50-c4f2f61f9b4e" (UID: "33946ece-f847-4ce2-ab50-c4f2f61f9b4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.107277 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406011 4795 generic.go:334] "Generic (PLEG): container finished" podID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerID="a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" exitCode=0 Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerDied","Data":"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc"} Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerDied","Data":"3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b"} Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406110 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406119 4795 scope.go:117] "RemoveContainer" containerID="a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.439091 4795 scope.go:117] "RemoveContainer" containerID="b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.470301 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.480589 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.489421 4795 scope.go:117] "RemoveContainer" containerID="487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.536051 4795 scope.go:117] "RemoveContainer" containerID="a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" Feb 19 23:22:32 crc kubenswrapper[4795]: E0219 23:22:32.536516 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc\": container with ID starting with a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc not found: ID does not exist" containerID="a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.536625 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc"} err="failed to get container status \"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc\": rpc error: code = NotFound desc = could not find container 
\"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc\": container with ID starting with a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc not found: ID does not exist" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.536718 4795 scope.go:117] "RemoveContainer" containerID="b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4" Feb 19 23:22:32 crc kubenswrapper[4795]: E0219 23:22:32.537088 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4\": container with ID starting with b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4 not found: ID does not exist" containerID="b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.537140 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4"} err="failed to get container status \"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4\": rpc error: code = NotFound desc = could not find container \"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4\": container with ID starting with b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4 not found: ID does not exist" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.537210 4795 scope.go:117] "RemoveContainer" containerID="487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28" Feb 19 23:22:32 crc kubenswrapper[4795]: E0219 23:22:32.537556 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28\": container with ID starting with 487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28 not found: ID does not exist" 
containerID="487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.537590 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28"} err="failed to get container status \"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28\": rpc error: code = NotFound desc = could not find container \"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28\": container with ID starting with 487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28 not found: ID does not exist" Feb 19 23:22:33 crc kubenswrapper[4795]: I0219 23:22:33.529128 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" path="/var/lib/kubelet/pods/33946ece-f847-4ce2-ab50-c4f2f61f9b4e/volumes" Feb 19 23:23:58 crc kubenswrapper[4795]: I0219 23:23:58.427109 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:23:58 crc kubenswrapper[4795]: I0219 23:23:58.427832 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:24:28 crc kubenswrapper[4795]: I0219 23:24:28.427945 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Feb 19 23:24:28 crc kubenswrapper[4795]: I0219 23:24:28.428580 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.427295 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.427733 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.427777 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.428553 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.428605 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" gracePeriod=600 Feb 19 23:24:58 crc kubenswrapper[4795]: E0219 23:24:58.547524 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.787452 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" exitCode=0 Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.787563 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"} Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.787918 4795 scope.go:117] "RemoveContainer" containerID="ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.788632 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:24:58 crc kubenswrapper[4795]: E0219 23:24:58.789054 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:25:09 crc kubenswrapper[4795]: I0219 23:25:09.530460 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:25:09 crc kubenswrapper[4795]: E0219 23:25:09.533541 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:25:14 crc kubenswrapper[4795]: I0219 23:25:14.959065 4795 generic.go:334] "Generic (PLEG): container finished" podID="a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" containerID="94c7431ebacbd1806e6fe318625607d7ac4c8655988210ced64888f48d2153e5" exitCode=0 Feb 19 23:25:14 crc kubenswrapper[4795]: I0219 23:25:14.959237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" event={"ID":"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa","Type":"ContainerDied","Data":"94c7431ebacbd1806e6fe318625607d7ac4c8655988210ced64888f48d2153e5"} Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.465736 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.513049 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.514392 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph" (OuterVolumeSpecName: "ceph") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.516182 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc" (OuterVolumeSpecName: "kube-api-access-kwdtc") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "kube-api-access-kwdtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.540464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.550324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory" (OuterVolumeSpecName: "inventory") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607091 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607132 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607144 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607154 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607181 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.983125 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" event={"ID":"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa","Type":"ContainerDied","Data":"8334d40c98b3c6b685c906be16789e5a35ea56989f4f2c9374c39b35dba237a9"} Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.983203 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8334d40c98b3c6b685c906be16789e5a35ea56989f4f2c9374c39b35dba237a9" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.983210 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.088923 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wfcx4"] Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089455 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="extract-content" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089478 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="extract-content" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089507 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="extract-content" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089517 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="extract-content" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089540 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" containerName="bootstrap-openstack-openstack-cell1" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089548 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" 
containerName="bootstrap-openstack-openstack-cell1" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089579 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="extract-utilities" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089587 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="extract-utilities" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089600 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089608 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089620 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="extract-utilities" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089628 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="extract-utilities" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089644 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089653 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089877 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" containerName="bootstrap-openstack-openstack-cell1" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089904 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089923 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.092295 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.099052 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.100709 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.101004 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.101292 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.109461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wfcx4"] Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.123929 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.124072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.124434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.124561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.225653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.225707 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc 
kubenswrapper[4795]: I0219 23:25:17.225766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.225821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.229358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.229358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.230969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: 
\"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.242577 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.426475 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.983398 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wfcx4"] Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.998230 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" event={"ID":"69464400-c61c-41bd-aeeb-984f7f948a16","Type":"ContainerStarted","Data":"5a15f244c3f5001d78c7272f8f15efa1a2a5ab014c4d21f43ab37cd73349c5a7"} Feb 19 23:25:20 crc kubenswrapper[4795]: I0219 23:25:20.027285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" event={"ID":"69464400-c61c-41bd-aeeb-984f7f948a16","Type":"ContainerStarted","Data":"61f2e5abf146e1a51e958bfaea16801ce6ea6794f6232235430acd11508c38dc"} Feb 19 23:25:23 crc kubenswrapper[4795]: I0219 23:25:23.513451 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:25:23 crc kubenswrapper[4795]: E0219 23:25:23.514348 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:25:34 crc kubenswrapper[4795]: I0219 23:25:34.512135 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:25:34 crc kubenswrapper[4795]: E0219 23:25:34.512890 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.256800 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" podStartSLOduration=25.366609336 podStartE2EDuration="26.256777388s" podCreationTimestamp="2026-02-19 23:25:17 +0000 UTC" firstStartedPulling="2026-02-19 23:25:17.984945652 +0000 UTC m=+7029.177463516" lastFinishedPulling="2026-02-19 23:25:18.875113654 +0000 UTC m=+7030.067631568" observedRunningTime="2026-02-19 23:25:20.043845502 +0000 UTC m=+7031.236363396" watchObservedRunningTime="2026-02-19 23:25:43.256777388 +0000 UTC m=+7054.449295252" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.260756 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.263620 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.280397 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.403038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.404235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.404346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506259 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506703 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506857 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.531135 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.606474 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:44 crc kubenswrapper[4795]: I0219 23:25:44.094341 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:44 crc kubenswrapper[4795]: I0219 23:25:44.264109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerStarted","Data":"58a1348a548eed875beb7501c5f4e5a8e1b842ab2941eb12d2da73920b895673"} Feb 19 23:25:45 crc kubenswrapper[4795]: I0219 23:25:45.277441 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerID="acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c" exitCode=0 Feb 19 23:25:45 crc kubenswrapper[4795]: I0219 23:25:45.277619 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerDied","Data":"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c"} Feb 19 23:25:45 crc kubenswrapper[4795]: I0219 23:25:45.280859 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:25:47 crc kubenswrapper[4795]: I0219 23:25:47.512724 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:25:47 crc kubenswrapper[4795]: E0219 23:25:47.513518 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 
23:25:48 crc kubenswrapper[4795]: I0219 23:25:48.324178 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerID="57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f" exitCode=0 Feb 19 23:25:48 crc kubenswrapper[4795]: I0219 23:25:48.324275 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerDied","Data":"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f"} Feb 19 23:25:49 crc kubenswrapper[4795]: I0219 23:25:49.334914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerStarted","Data":"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64"} Feb 19 23:25:49 crc kubenswrapper[4795]: I0219 23:25:49.361270 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j2f42" podStartSLOduration=2.92788686 podStartE2EDuration="6.361251095s" podCreationTimestamp="2026-02-19 23:25:43 +0000 UTC" firstStartedPulling="2026-02-19 23:25:45.280480588 +0000 UTC m=+7056.472998462" lastFinishedPulling="2026-02-19 23:25:48.713844833 +0000 UTC m=+7059.906362697" observedRunningTime="2026-02-19 23:25:49.35261218 +0000 UTC m=+7060.545130044" watchObservedRunningTime="2026-02-19 23:25:49.361251095 +0000 UTC m=+7060.553768959" Feb 19 23:25:53 crc kubenswrapper[4795]: I0219 23:25:53.607114 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:53 crc kubenswrapper[4795]: I0219 23:25:53.607749 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:53 crc kubenswrapper[4795]: I0219 23:25:53.650102 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:54 crc kubenswrapper[4795]: I0219 23:25:54.444026 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:54 crc kubenswrapper[4795]: I0219 23:25:54.499040 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:56 crc kubenswrapper[4795]: I0219 23:25:56.410002 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j2f42" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="registry-server" containerID="cri-o://19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" gracePeriod=2 Feb 19 23:25:56 crc kubenswrapper[4795]: I0219 23:25:56.912650 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.001479 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") pod \"3c5b2e93-bfd6-409c-955a-78f80b984a11\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.001592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") pod \"3c5b2e93-bfd6-409c-955a-78f80b984a11\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.001713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") 
pod \"3c5b2e93-bfd6-409c-955a-78f80b984a11\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.002595 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities" (OuterVolumeSpecName: "utilities") pod "3c5b2e93-bfd6-409c-955a-78f80b984a11" (UID: "3c5b2e93-bfd6-409c-955a-78f80b984a11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.007348 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h" (OuterVolumeSpecName: "kube-api-access-twq4h") pod "3c5b2e93-bfd6-409c-955a-78f80b984a11" (UID: "3c5b2e93-bfd6-409c-955a-78f80b984a11"). InnerVolumeSpecName "kube-api-access-twq4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.036840 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c5b2e93-bfd6-409c-955a-78f80b984a11" (UID: "3c5b2e93-bfd6-409c-955a-78f80b984a11"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.104264 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.104291 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.104300 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.427256 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerID="19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" exitCode=0 Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.427361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerDied","Data":"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64"} Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.427431 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerDied","Data":"58a1348a548eed875beb7501c5f4e5a8e1b842ab2941eb12d2da73920b895673"} Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.427471 4795 scope.go:117] "RemoveContainer" containerID="19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 
23:25:57.428367 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.474623 4795 scope.go:117] "RemoveContainer" containerID="57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.478082 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.492557 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.499652 4795 scope.go:117] "RemoveContainer" containerID="acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.529660 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" path="/var/lib/kubelet/pods/3c5b2e93-bfd6-409c-955a-78f80b984a11/volumes" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.573839 4795 scope.go:117] "RemoveContainer" containerID="19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" Feb 19 23:25:57 crc kubenswrapper[4795]: E0219 23:25:57.574402 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64\": container with ID starting with 19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64 not found: ID does not exist" containerID="19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.574439 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64"} err="failed to get 
container status \"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64\": rpc error: code = NotFound desc = could not find container \"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64\": container with ID starting with 19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64 not found: ID does not exist" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.574463 4795 scope.go:117] "RemoveContainer" containerID="57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f" Feb 19 23:25:57 crc kubenswrapper[4795]: E0219 23:25:57.575103 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f\": container with ID starting with 57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f not found: ID does not exist" containerID="57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.575156 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f"} err="failed to get container status \"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f\": rpc error: code = NotFound desc = could not find container \"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f\": container with ID starting with 57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f not found: ID does not exist" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.575224 4795 scope.go:117] "RemoveContainer" containerID="acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c" Feb 19 23:25:57 crc kubenswrapper[4795]: E0219 23:25:57.575705 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c\": container with ID starting with acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c not found: ID does not exist" containerID="acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.575743 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c"} err="failed to get container status \"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c\": rpc error: code = NotFound desc = could not find container \"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c\": container with ID starting with acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c not found: ID does not exist" Feb 19 23:26:00 crc kubenswrapper[4795]: I0219 23:26:00.512565 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:00 crc kubenswrapper[4795]: E0219 23:26:00.513588 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:26:15 crc kubenswrapper[4795]: I0219 23:26:15.512114 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:15 crc kubenswrapper[4795]: E0219 23:26:15.513030 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:26:26 crc kubenswrapper[4795]: I0219 23:26:26.512490 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:26 crc kubenswrapper[4795]: E0219 23:26:26.513238 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:26:41 crc kubenswrapper[4795]: I0219 23:26:41.512373 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:41 crc kubenswrapper[4795]: E0219 23:26:41.513782 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:26:55 crc kubenswrapper[4795]: I0219 23:26:55.512438 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:55 crc kubenswrapper[4795]: E0219 23:26:55.513416 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:07 crc kubenswrapper[4795]: I0219 23:27:07.512481 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:07 crc kubenswrapper[4795]: E0219 23:27:07.513302 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:09 crc kubenswrapper[4795]: I0219 23:27:09.172915 4795 generic.go:334] "Generic (PLEG): container finished" podID="69464400-c61c-41bd-aeeb-984f7f948a16" containerID="61f2e5abf146e1a51e958bfaea16801ce6ea6794f6232235430acd11508c38dc" exitCode=0 Feb 19 23:27:09 crc kubenswrapper[4795]: I0219 23:27:09.172994 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" event={"ID":"69464400-c61c-41bd-aeeb-984f7f948a16","Type":"ContainerDied","Data":"61f2e5abf146e1a51e958bfaea16801ce6ea6794f6232235430acd11508c38dc"} Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.707557 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.755616 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") pod \"69464400-c61c-41bd-aeeb-984f7f948a16\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.755788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") pod \"69464400-c61c-41bd-aeeb-984f7f948a16\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.755825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") pod \"69464400-c61c-41bd-aeeb-984f7f948a16\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.755969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") pod \"69464400-c61c-41bd-aeeb-984f7f948a16\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.763372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph" (OuterVolumeSpecName: "ceph") pod "69464400-c61c-41bd-aeeb-984f7f948a16" (UID: "69464400-c61c-41bd-aeeb-984f7f948a16"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.768368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879" (OuterVolumeSpecName: "kube-api-access-8z879") pod "69464400-c61c-41bd-aeeb-984f7f948a16" (UID: "69464400-c61c-41bd-aeeb-984f7f948a16"). InnerVolumeSpecName "kube-api-access-8z879". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.787637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "69464400-c61c-41bd-aeeb-984f7f948a16" (UID: "69464400-c61c-41bd-aeeb-984f7f948a16"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.804372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory" (OuterVolumeSpecName: "inventory") pod "69464400-c61c-41bd-aeeb-984f7f948a16" (UID: "69464400-c61c-41bd-aeeb-984f7f948a16"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.859587 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.859626 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.859641 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.859651 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") on node \"crc\" DevicePath \"\"" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.192763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" event={"ID":"69464400-c61c-41bd-aeeb-984f7f948a16","Type":"ContainerDied","Data":"5a15f244c3f5001d78c7272f8f15efa1a2a5ab014c4d21f43ab37cd73349c5a7"} Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.192813 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a15f244c3f5001d78c7272f8f15efa1a2a5ab014c4d21f43ab37cd73349c5a7" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.192826 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.279444 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-2twf8"] Feb 19 23:27:11 crc kubenswrapper[4795]: E0219 23:27:11.280150 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="extract-content" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280187 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="extract-content" Feb 19 23:27:11 crc kubenswrapper[4795]: E0219 23:27:11.280212 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="registry-server" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280220 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="registry-server" Feb 19 23:27:11 crc kubenswrapper[4795]: E0219 23:27:11.280266 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="extract-utilities" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280276 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="extract-utilities" Feb 19 23:27:11 crc kubenswrapper[4795]: E0219 23:27:11.280301 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69464400-c61c-41bd-aeeb-984f7f948a16" containerName="download-cache-openstack-openstack-cell1" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280308 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="69464400-c61c-41bd-aeeb-984f7f948a16" containerName="download-cache-openstack-openstack-cell1" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280530 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="69464400-c61c-41bd-aeeb-984f7f948a16" containerName="download-cache-openstack-openstack-cell1" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280547 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="registry-server" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.281683 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.284040 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.284315 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.287222 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.288130 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.291284 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-2twf8"] Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.369049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.369257 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.369402 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.369641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.473247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.473387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") pod \"configure-network-openstack-openstack-cell1-2twf8\" 
(UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.473453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.473551 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.478570 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.479345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.480281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.490209 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.598889 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:12 crc kubenswrapper[4795]: I0219 23:27:12.184997 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-2twf8"] Feb 19 23:27:12 crc kubenswrapper[4795]: I0219 23:27:12.205707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" event={"ID":"5a293bce-3326-47c0-a9b5-b5af13dc46c8","Type":"ContainerStarted","Data":"7e52b4553606d0655b60660a80810958f26b5efc3f788860961635ae7f9bd444"} Feb 19 23:27:13 crc kubenswrapper[4795]: I0219 23:27:13.216897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" event={"ID":"5a293bce-3326-47c0-a9b5-b5af13dc46c8","Type":"ContainerStarted","Data":"359a513d338aa3470a639a702c3ec34b9225f00bb77c08a2396abde50063710a"} Feb 19 23:27:13 crc kubenswrapper[4795]: I0219 23:27:13.246264 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" podStartSLOduration=1.701007835 
podStartE2EDuration="2.246218848s" podCreationTimestamp="2026-02-19 23:27:11 +0000 UTC" firstStartedPulling="2026-02-19 23:27:12.194785259 +0000 UTC m=+7143.387303123" lastFinishedPulling="2026-02-19 23:27:12.739996262 +0000 UTC m=+7143.932514136" observedRunningTime="2026-02-19 23:27:13.237144591 +0000 UTC m=+7144.429662505" watchObservedRunningTime="2026-02-19 23:27:13.246218848 +0000 UTC m=+7144.438736722" Feb 19 23:27:22 crc kubenswrapper[4795]: I0219 23:27:22.512716 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:22 crc kubenswrapper[4795]: E0219 23:27:22.513900 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:34 crc kubenswrapper[4795]: I0219 23:27:34.525797 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:34 crc kubenswrapper[4795]: E0219 23:27:34.526801 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:45 crc kubenswrapper[4795]: I0219 23:27:45.512705 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:45 crc kubenswrapper[4795]: E0219 23:27:45.513652 
4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:59 crc kubenswrapper[4795]: I0219 23:27:59.520182 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:59 crc kubenswrapper[4795]: E0219 23:27:59.520959 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:28:14 crc kubenswrapper[4795]: I0219 23:28:14.512541 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:28:14 crc kubenswrapper[4795]: E0219 23:28:14.513626 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:28:29 crc kubenswrapper[4795]: I0219 23:28:29.517752 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:28:29 crc kubenswrapper[4795]: E0219 
23:28:29.518555 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:28:40 crc kubenswrapper[4795]: I0219 23:28:40.117948 4795 generic.go:334] "Generic (PLEG): container finished" podID="5a293bce-3326-47c0-a9b5-b5af13dc46c8" containerID="359a513d338aa3470a639a702c3ec34b9225f00bb77c08a2396abde50063710a" exitCode=0 Feb 19 23:28:40 crc kubenswrapper[4795]: I0219 23:28:40.118078 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" event={"ID":"5a293bce-3326-47c0-a9b5-b5af13dc46c8","Type":"ContainerDied","Data":"359a513d338aa3470a639a702c3ec34b9225f00bb77c08a2396abde50063710a"} Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.589322 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.661700 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") pod \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.662225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") pod \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.662353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") pod \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.662405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") pod \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.668808 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n" (OuterVolumeSpecName: "kube-api-access-b628n") pod "5a293bce-3326-47c0-a9b5-b5af13dc46c8" (UID: "5a293bce-3326-47c0-a9b5-b5af13dc46c8"). InnerVolumeSpecName "kube-api-access-b628n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.668910 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph" (OuterVolumeSpecName: "ceph") pod "5a293bce-3326-47c0-a9b5-b5af13dc46c8" (UID: "5a293bce-3326-47c0-a9b5-b5af13dc46c8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.691432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5a293bce-3326-47c0-a9b5-b5af13dc46c8" (UID: "5a293bce-3326-47c0-a9b5-b5af13dc46c8"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.696637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory" (OuterVolumeSpecName: "inventory") pod "5a293bce-3326-47c0-a9b5-b5af13dc46c8" (UID: "5a293bce-3326-47c0-a9b5-b5af13dc46c8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.765907 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.765957 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.765977 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.765994 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.138743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" event={"ID":"5a293bce-3326-47c0-a9b5-b5af13dc46c8","Type":"ContainerDied","Data":"7e52b4553606d0655b60660a80810958f26b5efc3f788860961635ae7f9bd444"} Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.138783 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e52b4553606d0655b60660a80810958f26b5efc3f788860961635ae7f9bd444" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.139567 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.230393 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-44t2w"] Feb 19 23:28:42 crc kubenswrapper[4795]: E0219 23:28:42.230962 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a293bce-3326-47c0-a9b5-b5af13dc46c8" containerName="configure-network-openstack-openstack-cell1" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.230991 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a293bce-3326-47c0-a9b5-b5af13dc46c8" containerName="configure-network-openstack-openstack-cell1" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.231283 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a293bce-3326-47c0-a9b5-b5af13dc46c8" containerName="configure-network-openstack-openstack-cell1" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.232002 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.234536 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.238340 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.238646 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.242667 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.255731 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-44t2w"] Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.279857 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.279930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.280084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tpgbv\" (UniqueName: \"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.280117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.382457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.382506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.382597 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbv\" (UniqueName: \"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " 
pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.382622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.386907 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.387498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.393834 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.398285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgbv\" (UniqueName: 
\"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.513701 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:28:42 crc kubenswrapper[4795]: E0219 23:28:42.513999 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.566536 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:43 crc kubenswrapper[4795]: I0219 23:28:43.126180 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-44t2w"] Feb 19 23:28:43 crc kubenswrapper[4795]: W0219 23:28:43.136593 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94dbf6be_911e_46d9_a950_fa19fa137490.slice/crio-637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80 WatchSource:0}: Error finding container 637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80: Status 404 returned error can't find the container with id 637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80 Feb 19 23:28:44 crc kubenswrapper[4795]: I0219 23:28:44.164070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" event={"ID":"94dbf6be-911e-46d9-a950-fa19fa137490","Type":"ContainerStarted","Data":"52eb3235fcf364c95d5fa52e43247c7f4457ed2cf97218b6e6ddac7c429f7e1b"} Feb 19 23:28:44 crc kubenswrapper[4795]: I0219 23:28:44.164615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" event={"ID":"94dbf6be-911e-46d9-a950-fa19fa137490","Type":"ContainerStarted","Data":"637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80"} Feb 19 23:28:44 crc kubenswrapper[4795]: I0219 23:28:44.192921 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" podStartSLOduration=1.709258087 podStartE2EDuration="2.192902013s" podCreationTimestamp="2026-02-19 23:28:42 +0000 UTC" firstStartedPulling="2026-02-19 23:28:43.145141349 +0000 UTC m=+7234.337659213" lastFinishedPulling="2026-02-19 23:28:43.628785275 +0000 UTC m=+7234.821303139" observedRunningTime="2026-02-19 
23:28:44.191543005 +0000 UTC m=+7235.384060869" watchObservedRunningTime="2026-02-19 23:28:44.192902013 +0000 UTC m=+7235.385419867" Feb 19 23:28:49 crc kubenswrapper[4795]: I0219 23:28:49.231477 4795 generic.go:334] "Generic (PLEG): container finished" podID="94dbf6be-911e-46d9-a950-fa19fa137490" containerID="52eb3235fcf364c95d5fa52e43247c7f4457ed2cf97218b6e6ddac7c429f7e1b" exitCode=0 Feb 19 23:28:49 crc kubenswrapper[4795]: I0219 23:28:49.231595 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" event={"ID":"94dbf6be-911e-46d9-a950-fa19fa137490","Type":"ContainerDied","Data":"52eb3235fcf364c95d5fa52e43247c7f4457ed2cf97218b6e6ddac7c429f7e1b"} Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.887999 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.970495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") pod \"94dbf6be-911e-46d9-a950-fa19fa137490\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.970677 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") pod \"94dbf6be-911e-46d9-a950-fa19fa137490\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.970721 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") pod \"94dbf6be-911e-46d9-a950-fa19fa137490\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " Feb 19 23:28:50 crc 
kubenswrapper[4795]: I0219 23:28:50.970903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpgbv\" (UniqueName: \"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") pod \"94dbf6be-911e-46d9-a950-fa19fa137490\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.975906 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph" (OuterVolumeSpecName: "ceph") pod "94dbf6be-911e-46d9-a950-fa19fa137490" (UID: "94dbf6be-911e-46d9-a950-fa19fa137490"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.976454 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv" (OuterVolumeSpecName: "kube-api-access-tpgbv") pod "94dbf6be-911e-46d9-a950-fa19fa137490" (UID: "94dbf6be-911e-46d9-a950-fa19fa137490"). InnerVolumeSpecName "kube-api-access-tpgbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.001837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory" (OuterVolumeSpecName: "inventory") pod "94dbf6be-911e-46d9-a950-fa19fa137490" (UID: "94dbf6be-911e-46d9-a950-fa19fa137490"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.003749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "94dbf6be-911e-46d9-a950-fa19fa137490" (UID: "94dbf6be-911e-46d9-a950-fa19fa137490"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.072899 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpgbv\" (UniqueName: \"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.072933 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.072943 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.072952 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.254149 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" event={"ID":"94dbf6be-911e-46d9-a950-fa19fa137490","Type":"ContainerDied","Data":"637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80"} Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.254198 4795 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.254248 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.347010 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6ctfc"] Feb 19 23:28:51 crc kubenswrapper[4795]: E0219 23:28:51.347548 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94dbf6be-911e-46d9-a950-fa19fa137490" containerName="validate-network-openstack-openstack-cell1" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.347566 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="94dbf6be-911e-46d9-a950-fa19fa137490" containerName="validate-network-openstack-openstack-cell1" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.347832 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="94dbf6be-911e-46d9-a950-fa19fa137490" containerName="validate-network-openstack-openstack-cell1" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.348915 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.353759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.354126 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.354463 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.354680 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.369488 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6ctfc"] Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.384981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.385441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.385494 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.385568 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.487559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.487614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.487654 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc 
kubenswrapper[4795]: I0219 23:28:51.487733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.491474 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.492130 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.492508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.507850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " 
pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.676380 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:52 crc kubenswrapper[4795]: I0219 23:28:52.230955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6ctfc"] Feb 19 23:28:52 crc kubenswrapper[4795]: I0219 23:28:52.267895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" event={"ID":"41df3556-7d70-47f5-bd79-bec74fbd269c","Type":"ContainerStarted","Data":"24420da62070a51e6a7078c855f70d8ac227fa3cb1925b98155c54146d41d372"} Feb 19 23:28:53 crc kubenswrapper[4795]: I0219 23:28:53.277036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" event={"ID":"41df3556-7d70-47f5-bd79-bec74fbd269c","Type":"ContainerStarted","Data":"32f851236caf6b2bbe224e448255cea34c1cbbfdd587e7b66125b4e5133ed565"} Feb 19 23:28:53 crc kubenswrapper[4795]: I0219 23:28:53.296545 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" podStartSLOduration=1.898797467 podStartE2EDuration="2.296526945s" podCreationTimestamp="2026-02-19 23:28:51 +0000 UTC" firstStartedPulling="2026-02-19 23:28:52.25618306 +0000 UTC m=+7243.448700924" lastFinishedPulling="2026-02-19 23:28:52.653912538 +0000 UTC m=+7243.846430402" observedRunningTime="2026-02-19 23:28:53.293332214 +0000 UTC m=+7244.485850098" watchObservedRunningTime="2026-02-19 23:28:53.296526945 +0000 UTC m=+7244.489044809" Feb 19 23:28:56 crc kubenswrapper[4795]: I0219 23:28:56.511842 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:28:56 crc kubenswrapper[4795]: E0219 23:28:56.512915 4795 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:29:07 crc kubenswrapper[4795]: I0219 23:29:07.512604 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"
Feb 19 23:29:07 crc kubenswrapper[4795]: E0219 23:29:07.513768 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:29:21 crc kubenswrapper[4795]: I0219 23:29:21.512467 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"
Feb 19 23:29:21 crc kubenswrapper[4795]: E0219 23:29:21.518087 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:29:34 crc kubenswrapper[4795]: I0219 23:29:34.512236 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"
Feb 19 23:29:34 crc kubenswrapper[4795]: E0219 23:29:34.512983 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:29:36 crc kubenswrapper[4795]: I0219 23:29:36.707613 4795 generic.go:334] "Generic (PLEG): container finished" podID="41df3556-7d70-47f5-bd79-bec74fbd269c" containerID="32f851236caf6b2bbe224e448255cea34c1cbbfdd587e7b66125b4e5133ed565" exitCode=0
Feb 19 23:29:36 crc kubenswrapper[4795]: I0219 23:29:36.707688 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" event={"ID":"41df3556-7d70-47f5-bd79-bec74fbd269c","Type":"ContainerDied","Data":"32f851236caf6b2bbe224e448255cea34c1cbbfdd587e7b66125b4e5133ed565"}
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.222891 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6ctfc"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.290840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") pod \"41df3556-7d70-47f5-bd79-bec74fbd269c\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") "
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.290888 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") pod \"41df3556-7d70-47f5-bd79-bec74fbd269c\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") "
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.290996 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") pod \"41df3556-7d70-47f5-bd79-bec74fbd269c\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") "
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.291131 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") pod \"41df3556-7d70-47f5-bd79-bec74fbd269c\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") "
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.296080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd" (OuterVolumeSpecName: "kube-api-access-74wxd") pod "41df3556-7d70-47f5-bd79-bec74fbd269c" (UID: "41df3556-7d70-47f5-bd79-bec74fbd269c"). InnerVolumeSpecName "kube-api-access-74wxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.298302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph" (OuterVolumeSpecName: "ceph") pod "41df3556-7d70-47f5-bd79-bec74fbd269c" (UID: "41df3556-7d70-47f5-bd79-bec74fbd269c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.320468 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory" (OuterVolumeSpecName: "inventory") pod "41df3556-7d70-47f5-bd79-bec74fbd269c" (UID: "41df3556-7d70-47f5-bd79-bec74fbd269c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.320590 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "41df3556-7d70-47f5-bd79-bec74fbd269c" (UID: "41df3556-7d70-47f5-bd79-bec74fbd269c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.393376 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.393581 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.393661 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") on node \"crc\" DevicePath \"\""
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.393728 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") on node \"crc\" DevicePath \"\""
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.723889 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" event={"ID":"41df3556-7d70-47f5-bd79-bec74fbd269c","Type":"ContainerDied","Data":"24420da62070a51e6a7078c855f70d8ac227fa3cb1925b98155c54146d41d372"}
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.723925 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24420da62070a51e6a7078c855f70d8ac227fa3cb1925b98155c54146d41d372"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.723928 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6ctfc"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.819289 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-lbssf"]
Feb 19 23:29:38 crc kubenswrapper[4795]: E0219 23:29:38.819691 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41df3556-7d70-47f5-bd79-bec74fbd269c" containerName="install-os-openstack-openstack-cell1"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.819709 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="41df3556-7d70-47f5-bd79-bec74fbd269c" containerName="install-os-openstack-openstack-cell1"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.819932 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="41df3556-7d70-47f5-bd79-bec74fbd269c" containerName="install-os-openstack-openstack-cell1"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.820861 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.823239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.824189 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.824772 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.825701 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.830213 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-lbssf"]
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.904528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.904903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.904937 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.904964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.006386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.006447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.006553 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.006666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.013865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.014179 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.021347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.027791 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.139585 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:29:39 crc kubenswrapper[4795]: W0219 23:29:39.756476 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d850dd7_a1bb_42db_893b_b96eebee4c9c.slice/crio-1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30 WatchSource:0}: Error finding container 1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30: Status 404 returned error can't find the container with id 1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.758306 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-lbssf"]
Feb 19 23:29:40 crc kubenswrapper[4795]: I0219 23:29:40.741354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" event={"ID":"0d850dd7-a1bb-42db-893b-b96eebee4c9c","Type":"ContainerStarted","Data":"c9f6543c31eab5a72ff02a75a7c4ec250b61feb6ce31b2ae3d2ad84bde1fb437"}
Feb 19 23:29:40 crc kubenswrapper[4795]: I0219 23:29:40.741712 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" event={"ID":"0d850dd7-a1bb-42db-893b-b96eebee4c9c","Type":"ContainerStarted","Data":"1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30"}
Feb 19 23:29:40 crc kubenswrapper[4795]: I0219 23:29:40.782030 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" podStartSLOduration=2.358657315 podStartE2EDuration="2.78201064s" podCreationTimestamp="2026-02-19 23:29:38 +0000 UTC" firstStartedPulling="2026-02-19 23:29:39.759883952 +0000 UTC m=+7290.952401826" lastFinishedPulling="2026-02-19 23:29:40.183237287 +0000 UTC m=+7291.375755151" observedRunningTime="2026-02-19 23:29:40.754502429 +0000 UTC m=+7291.947020293" watchObservedRunningTime="2026-02-19 23:29:40.78201064 +0000 UTC m=+7291.974528504"
Feb 19 23:29:48 crc kubenswrapper[4795]: I0219 23:29:48.511959 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"
Feb 19 23:29:48 crc kubenswrapper[4795]: E0219 23:29:48.512670 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.153893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"]
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.156735 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.158840 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.159188 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.162652 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"]
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.228934 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.229117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.229266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.330882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.330988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.331043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.332040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.338238 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.348791 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.530346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.988095 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"]
Feb 19 23:30:01 crc kubenswrapper[4795]: I0219 23:30:01.942502 4795 generic.go:334] "Generic (PLEG): container finished" podID="4ac63769-08cc-4f7e-9015-59c2df94d39f" containerID="977244ee3537e5139e2881adcfaafbf007340a13c0343e3c88ee23e7e14c9ddf" exitCode=0
Feb 19 23:30:01 crc kubenswrapper[4795]: I0219 23:30:01.942656 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" event={"ID":"4ac63769-08cc-4f7e-9015-59c2df94d39f","Type":"ContainerDied","Data":"977244ee3537e5139e2881adcfaafbf007340a13c0343e3c88ee23e7e14c9ddf"}
Feb 19 23:30:01 crc kubenswrapper[4795]: I0219 23:30:01.943324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" event={"ID":"4ac63769-08cc-4f7e-9015-59c2df94d39f","Type":"ContainerStarted","Data":"28272af00a87c52e2b8764b7a7c16cbbc76ba88c3db82f0f8bfc9356174e7b6b"}
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.313059 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.395792 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") pod \"4ac63769-08cc-4f7e-9015-59c2df94d39f\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") "
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.395895 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") pod \"4ac63769-08cc-4f7e-9015-59c2df94d39f\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") "
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.395946 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") pod \"4ac63769-08cc-4f7e-9015-59c2df94d39f\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") "
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.397132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume" (OuterVolumeSpecName: "config-volume") pod "4ac63769-08cc-4f7e-9015-59c2df94d39f" (UID: "4ac63769-08cc-4f7e-9015-59c2df94d39f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.401231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4ac63769-08cc-4f7e-9015-59c2df94d39f" (UID: "4ac63769-08cc-4f7e-9015-59c2df94d39f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.419754 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst" (OuterVolumeSpecName: "kube-api-access-m4nst") pod "4ac63769-08cc-4f7e-9015-59c2df94d39f" (UID: "4ac63769-08cc-4f7e-9015-59c2df94d39f"). InnerVolumeSpecName "kube-api-access-m4nst". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.499675 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.499705 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.499717 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.511977 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.969598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190"}
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.972472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" event={"ID":"4ac63769-08cc-4f7e-9015-59c2df94d39f","Type":"ContainerDied","Data":"28272af00a87c52e2b8764b7a7c16cbbc76ba88c3db82f0f8bfc9356174e7b6b"}
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.972523 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28272af00a87c52e2b8764b7a7c16cbbc76ba88c3db82f0f8bfc9356174e7b6b"
Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.972550 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"
Feb 19 23:30:04 crc kubenswrapper[4795]: I0219 23:30:04.388251 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"]
Feb 19 23:30:04 crc kubenswrapper[4795]: I0219 23:30:04.396961 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"]
Feb 19 23:30:05 crc kubenswrapper[4795]: I0219 23:30:05.532137 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54bacd9c-6bce-433c-972c-3990566baa40" path="/var/lib/kubelet/pods/54bacd9c-6bce-433c-972c-3990566baa40/volumes"
Feb 19 23:30:24 crc kubenswrapper[4795]: I0219 23:30:24.145422 4795 generic.go:334] "Generic (PLEG): container finished" podID="0d850dd7-a1bb-42db-893b-b96eebee4c9c" containerID="c9f6543c31eab5a72ff02a75a7c4ec250b61feb6ce31b2ae3d2ad84bde1fb437" exitCode=0
Feb 19 23:30:24 crc kubenswrapper[4795]: I0219 23:30:24.145523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" event={"ID":"0d850dd7-a1bb-42db-893b-b96eebee4c9c","Type":"ContainerDied","Data":"c9f6543c31eab5a72ff02a75a7c4ec250b61feb6ce31b2ae3d2ad84bde1fb437"}
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.702984 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.802498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") pod \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") "
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.802589 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") pod \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") "
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.802704 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") pod \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") "
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.802738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") pod \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") "
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.807763 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph" (OuterVolumeSpecName: "ceph") pod "0d850dd7-a1bb-42db-893b-b96eebee4c9c" (UID: "0d850dd7-a1bb-42db-893b-b96eebee4c9c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.810310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz" (OuterVolumeSpecName: "kube-api-access-wqhfz") pod "0d850dd7-a1bb-42db-893b-b96eebee4c9c" (UID: "0d850dd7-a1bb-42db-893b-b96eebee4c9c"). InnerVolumeSpecName "kube-api-access-wqhfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.836426 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0d850dd7-a1bb-42db-893b-b96eebee4c9c" (UID: "0d850dd7-a1bb-42db-893b-b96eebee4c9c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.837686 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory" (OuterVolumeSpecName: "inventory") pod "0d850dd7-a1bb-42db-893b-b96eebee4c9c" (UID: "0d850dd7-a1bb-42db-893b-b96eebee4c9c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.905114 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.905146 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.905155 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.905179 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") on node \"crc\" DevicePath \"\""
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.174746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" event={"ID":"0d850dd7-a1bb-42db-893b-b96eebee4c9c","Type":"ContainerDied","Data":"1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30"}
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.174793 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.174824 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-lbssf"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.251395 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-qzhll"]
Feb 19 23:30:26 crc kubenswrapper[4795]: E0219 23:30:26.251804 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d850dd7-a1bb-42db-893b-b96eebee4c9c" containerName="configure-os-openstack-openstack-cell1"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.251819 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d850dd7-a1bb-42db-893b-b96eebee4c9c" containerName="configure-os-openstack-openstack-cell1"
Feb 19 23:30:26 crc kubenswrapper[4795]: E0219 23:30:26.251849 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac63769-08cc-4f7e-9015-59c2df94d39f" containerName="collect-profiles"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.251855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac63769-08cc-4f7e-9015-59c2df94d39f" containerName="collect-profiles"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.252084 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d850dd7-a1bb-42db-893b-b96eebee4c9c" containerName="configure-os-openstack-openstack-cell1"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.252107 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac63769-08cc-4f7e-9015-59c2df94d39f" containerName="collect-profiles"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.252861 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qzhll"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.254743 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.254881 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.255629 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.255922 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.264586 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-qzhll"]
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.415218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.415298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll"
Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.415358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName:
\"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.415645 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.518119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.518202 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.518276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.518392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.532899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.532899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.533280 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.552545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.573703 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:27 crc kubenswrapper[4795]: I0219 23:30:27.142668 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-qzhll"] Feb 19 23:30:27 crc kubenswrapper[4795]: I0219 23:30:27.189135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qzhll" event={"ID":"adb280f6-14e8-45d3-91a1-1bf325d84aef","Type":"ContainerStarted","Data":"b55a3be4c08a80f7a0cdc33ce789154e9bfb621b45fd7b98bc130e28d94b7534"} Feb 19 23:30:28 crc kubenswrapper[4795]: I0219 23:30:28.118015 4795 scope.go:117] "RemoveContainer" containerID="eca52f37003a7a5168093ed1a2726d31a52843bd3e99b81128785c0ad70b60e9" Feb 19 23:30:28 crc kubenswrapper[4795]: I0219 23:30:28.206679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qzhll" event={"ID":"adb280f6-14e8-45d3-91a1-1bf325d84aef","Type":"ContainerStarted","Data":"4d38e981123d5faed9cee0f1f24700eb07147e276a988944bb7acaad976c3c58"} Feb 19 23:30:28 crc kubenswrapper[4795]: I0219 23:30:28.237458 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-qzhll" podStartSLOduration=1.779926218 podStartE2EDuration="2.237434072s" podCreationTimestamp="2026-02-19 23:30:26 +0000 UTC" firstStartedPulling="2026-02-19 23:30:27.146572245 +0000 UTC m=+7338.339090109" lastFinishedPulling="2026-02-19 23:30:27.604080099 +0000 UTC m=+7338.796597963" observedRunningTime="2026-02-19 23:30:28.227312115 +0000 UTC m=+7339.419829979" watchObservedRunningTime="2026-02-19 23:30:28.237434072 +0000 UTC m=+7339.429951946" Feb 19 23:30:36 crc kubenswrapper[4795]: I0219 23:30:36.313991 4795 generic.go:334] "Generic (PLEG): container finished" podID="adb280f6-14e8-45d3-91a1-1bf325d84aef" containerID="4d38e981123d5faed9cee0f1f24700eb07147e276a988944bb7acaad976c3c58" exitCode=0 Feb 19 23:30:36 crc kubenswrapper[4795]: I0219 
23:30:36.314547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qzhll" event={"ID":"adb280f6-14e8-45d3-91a1-1bf325d84aef","Type":"ContainerDied","Data":"4d38e981123d5faed9cee0f1f24700eb07147e276a988944bb7acaad976c3c58"} Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.766467 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.900076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") pod \"adb280f6-14e8-45d3-91a1-1bf325d84aef\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.900480 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") pod \"adb280f6-14e8-45d3-91a1-1bf325d84aef\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.900513 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") pod \"adb280f6-14e8-45d3-91a1-1bf325d84aef\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.900588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") pod \"adb280f6-14e8-45d3-91a1-1bf325d84aef\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.905198 4795 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph" (OuterVolumeSpecName: "ceph") pod "adb280f6-14e8-45d3-91a1-1bf325d84aef" (UID: "adb280f6-14e8-45d3-91a1-1bf325d84aef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.905383 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg" (OuterVolumeSpecName: "kube-api-access-5l6mg") pod "adb280f6-14e8-45d3-91a1-1bf325d84aef" (UID: "adb280f6-14e8-45d3-91a1-1bf325d84aef"). InnerVolumeSpecName "kube-api-access-5l6mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.934973 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "adb280f6-14e8-45d3-91a1-1bf325d84aef" (UID: "adb280f6-14e8-45d3-91a1-1bf325d84aef"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.936336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "adb280f6-14e8-45d3-91a1-1bf325d84aef" (UID: "adb280f6-14e8-45d3-91a1-1bf325d84aef"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.006382 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.006420 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.006435 4795 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.006449 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.333973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qzhll" event={"ID":"adb280f6-14e8-45d3-91a1-1bf325d84aef","Type":"ContainerDied","Data":"b55a3be4c08a80f7a0cdc33ce789154e9bfb621b45fd7b98bc130e28d94b7534"} Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.334234 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b55a3be4c08a80f7a0cdc33ce789154e9bfb621b45fd7b98bc130e28d94b7534" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.334283 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.408188 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7vb2g"] Feb 19 23:30:38 crc kubenswrapper[4795]: E0219 23:30:38.408630 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb280f6-14e8-45d3-91a1-1bf325d84aef" containerName="ssh-known-hosts-openstack" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.408646 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb280f6-14e8-45d3-91a1-1bf325d84aef" containerName="ssh-known-hosts-openstack" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.408842 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb280f6-14e8-45d3-91a1-1bf325d84aef" containerName="ssh-known-hosts-openstack" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.409638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.414572 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.414864 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.414990 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.415126 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.418943 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7vb2g"] Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 
23:30:38.517348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.517765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.517892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.518062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.620348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: 
\"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.620548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.620596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.620719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.625783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.627405 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: 
\"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.629638 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.635057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.728185 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:39 crc kubenswrapper[4795]: I0219 23:30:39.245542 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7vb2g"] Feb 19 23:30:39 crc kubenswrapper[4795]: I0219 23:30:39.344849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" event={"ID":"d82522ab-bf1a-47f9-902b-c82105b5d09b","Type":"ContainerStarted","Data":"59fe3a0debe2d123dc2daa1bb01aef0814263fc901e65de780bd139ee8c3917e"} Feb 19 23:30:40 crc kubenswrapper[4795]: I0219 23:30:40.355868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" event={"ID":"d82522ab-bf1a-47f9-902b-c82105b5d09b","Type":"ContainerStarted","Data":"793f0b98692e634ad7a64e2338c47b0d5c616ca2233e547ea2a72b28316fe605"} Feb 19 23:30:40 crc kubenswrapper[4795]: I0219 23:30:40.378306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" podStartSLOduration=1.941187782 podStartE2EDuration="2.378289206s" podCreationTimestamp="2026-02-19 23:30:38 +0000 UTC" firstStartedPulling="2026-02-19 23:30:39.252033364 +0000 UTC m=+7350.444551228" lastFinishedPulling="2026-02-19 23:30:39.689134788 +0000 UTC m=+7350.881652652" observedRunningTime="2026-02-19 23:30:40.369827566 +0000 UTC m=+7351.562345430" watchObservedRunningTime="2026-02-19 23:30:40.378289206 +0000 UTC m=+7351.570807060" Feb 19 23:30:49 crc kubenswrapper[4795]: I0219 23:30:49.457601 4795 generic.go:334] "Generic (PLEG): container finished" podID="d82522ab-bf1a-47f9-902b-c82105b5d09b" containerID="793f0b98692e634ad7a64e2338c47b0d5c616ca2233e547ea2a72b28316fe605" exitCode=0 Feb 19 23:30:49 crc kubenswrapper[4795]: I0219 23:30:49.457902 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" 
event={"ID":"d82522ab-bf1a-47f9-902b-c82105b5d09b","Type":"ContainerDied","Data":"793f0b98692e634ad7a64e2338c47b0d5c616ca2233e547ea2a72b28316fe605"} Feb 19 23:30:50 crc kubenswrapper[4795]: I0219 23:30:50.928945 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.020894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") pod \"d82522ab-bf1a-47f9-902b-c82105b5d09b\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.021368 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") pod \"d82522ab-bf1a-47f9-902b-c82105b5d09b\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.021477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") pod \"d82522ab-bf1a-47f9-902b-c82105b5d09b\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.021566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") pod \"d82522ab-bf1a-47f9-902b-c82105b5d09b\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.027274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw" 
(OuterVolumeSpecName: "kube-api-access-8qdsw") pod "d82522ab-bf1a-47f9-902b-c82105b5d09b" (UID: "d82522ab-bf1a-47f9-902b-c82105b5d09b"). InnerVolumeSpecName "kube-api-access-8qdsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.027389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph" (OuterVolumeSpecName: "ceph") pod "d82522ab-bf1a-47f9-902b-c82105b5d09b" (UID: "d82522ab-bf1a-47f9-902b-c82105b5d09b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.054027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory" (OuterVolumeSpecName: "inventory") pod "d82522ab-bf1a-47f9-902b-c82105b5d09b" (UID: "d82522ab-bf1a-47f9-902b-c82105b5d09b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.054546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d82522ab-bf1a-47f9-902b-c82105b5d09b" (UID: "d82522ab-bf1a-47f9-902b-c82105b5d09b"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.125499 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.125720 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.125855 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.125979 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.485061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" event={"ID":"d82522ab-bf1a-47f9-902b-c82105b5d09b","Type":"ContainerDied","Data":"59fe3a0debe2d123dc2daa1bb01aef0814263fc901e65de780bd139ee8c3917e"} Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.485646 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59fe3a0debe2d123dc2daa1bb01aef0814263fc901e65de780bd139ee8c3917e" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.485099 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.578880 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-bdhcf"] Feb 19 23:30:51 crc kubenswrapper[4795]: E0219 23:30:51.579877 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82522ab-bf1a-47f9-902b-c82105b5d09b" containerName="run-os-openstack-openstack-cell1" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.579905 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82522ab-bf1a-47f9-902b-c82105b5d09b" containerName="run-os-openstack-openstack-cell1" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.580249 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82522ab-bf1a-47f9-902b-c82105b5d09b" containerName="run-os-openstack-openstack-cell1" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.581444 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.625418 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.625603 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.625624 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.625753 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.645496 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-bdhcf"] Feb 19 23:30:51 crc kubenswrapper[4795]: E0219 23:30:51.707009 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd82522ab_bf1a_47f9_902b_c82105b5d09b.slice/crio-59fe3a0debe2d123dc2daa1bb01aef0814263fc901e65de780bd139ee8c3917e\": RecentStats: unable to find data in memory cache]" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.756519 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.756615 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.756643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.756662 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.858405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.858507 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.858529 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.858552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.862601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.863048 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.867036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc 
kubenswrapper[4795]: I0219 23:30:51.873280 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.956792 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:52 crc kubenswrapper[4795]: I0219 23:30:52.512520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-bdhcf"] Feb 19 23:30:52 crc kubenswrapper[4795]: I0219 23:30:52.523875 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:30:53 crc kubenswrapper[4795]: I0219 23:30:53.502341 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" event={"ID":"99fb1ef3-d414-4a7e-9db8-54edf1aad197","Type":"ContainerStarted","Data":"0790acf7e75e4eda1a7132acb1d243e79b8c530ec6b23a04ac489a100963054d"} Feb 19 23:30:53 crc kubenswrapper[4795]: I0219 23:30:53.502635 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" event={"ID":"99fb1ef3-d414-4a7e-9db8-54edf1aad197","Type":"ContainerStarted","Data":"25ca7e89d2c644a930ea04fc731c49c656a8f3e6c490c2adb3ae60c32b542bfd"} Feb 19 23:30:53 crc kubenswrapper[4795]: I0219 23:30:53.531015 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" podStartSLOduration=2.076738852 podStartE2EDuration="2.53100168s" podCreationTimestamp="2026-02-19 23:30:51 +0000 UTC" firstStartedPulling="2026-02-19 23:30:52.523616667 +0000 UTC m=+7363.716134551" 
lastFinishedPulling="2026-02-19 23:30:52.977879515 +0000 UTC m=+7364.170397379" observedRunningTime="2026-02-19 23:30:53.527903243 +0000 UTC m=+7364.720421107" watchObservedRunningTime="2026-02-19 23:30:53.53100168 +0000 UTC m=+7364.723519544" Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.749193 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.754850 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.761473 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.919684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.920110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.920213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " 
pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.022635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.022766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.022854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.023300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.023485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " 
pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.051792 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.110458 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.663419 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.683731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerStarted","Data":"28b22d09111ffcca0542200937ddfd1229cba19345a04a67d12fa0c5bb260243"} Feb 19 23:31:07 crc kubenswrapper[4795]: I0219 23:31:07.711118 4795 generic.go:334] "Generic (PLEG): container finished" podID="30103546-d69d-4d13-a174-02fa1187e597" containerID="6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7" exitCode=0 Feb 19 23:31:07 crc kubenswrapper[4795]: I0219 23:31:07.711251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerDied","Data":"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7"} Feb 19 23:31:08 crc kubenswrapper[4795]: I0219 23:31:08.722820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" 
event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerStarted","Data":"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217"} Feb 19 23:31:08 crc kubenswrapper[4795]: I0219 23:31:08.724340 4795 generic.go:334] "Generic (PLEG): container finished" podID="99fb1ef3-d414-4a7e-9db8-54edf1aad197" containerID="0790acf7e75e4eda1a7132acb1d243e79b8c530ec6b23a04ac489a100963054d" exitCode=0 Feb 19 23:31:08 crc kubenswrapper[4795]: I0219 23:31:08.724385 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" event={"ID":"99fb1ef3-d414-4a7e-9db8-54edf1aad197","Type":"ContainerDied","Data":"0790acf7e75e4eda1a7132acb1d243e79b8c530ec6b23a04ac489a100963054d"} Feb 19 23:31:09 crc kubenswrapper[4795]: I0219 23:31:09.738889 4795 generic.go:334] "Generic (PLEG): container finished" podID="30103546-d69d-4d13-a174-02fa1187e597" containerID="0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217" exitCode=0 Feb 19 23:31:09 crc kubenswrapper[4795]: I0219 23:31:09.738938 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerDied","Data":"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217"} Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.239656 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.411791 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") pod \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.412604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") pod \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.412845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") pod \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.413004 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") pod \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.417092 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph" (OuterVolumeSpecName: "ceph") pod "99fb1ef3-d414-4a7e-9db8-54edf1aad197" (UID: "99fb1ef3-d414-4a7e-9db8-54edf1aad197"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.417708 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx" (OuterVolumeSpecName: "kube-api-access-c59dx") pod "99fb1ef3-d414-4a7e-9db8-54edf1aad197" (UID: "99fb1ef3-d414-4a7e-9db8-54edf1aad197"). InnerVolumeSpecName "kube-api-access-c59dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.438927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory" (OuterVolumeSpecName: "inventory") pod "99fb1ef3-d414-4a7e-9db8-54edf1aad197" (UID: "99fb1ef3-d414-4a7e-9db8-54edf1aad197"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.453447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "99fb1ef3-d414-4a7e-9db8-54edf1aad197" (UID: "99fb1ef3-d414-4a7e-9db8-54edf1aad197"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.516157 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.516205 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.516218 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.516228 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.748796 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerStarted","Data":"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e"} Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.751478 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" event={"ID":"99fb1ef3-d414-4a7e-9db8-54edf1aad197","Type":"ContainerDied","Data":"25ca7e89d2c644a930ea04fc731c49c656a8f3e6c490c2adb3ae60c32b542bfd"} Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.751520 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.751528 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25ca7e89d2c644a930ea04fc731c49c656a8f3e6c490c2adb3ae60c32b542bfd" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.776147 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4jw6z" podStartSLOduration=3.249008946 podStartE2EDuration="5.776130591s" podCreationTimestamp="2026-02-19 23:31:05 +0000 UTC" firstStartedPulling="2026-02-19 23:31:07.716938264 +0000 UTC m=+7378.909456138" lastFinishedPulling="2026-02-19 23:31:10.244059919 +0000 UTC m=+7381.436577783" observedRunningTime="2026-02-19 23:31:10.765392138 +0000 UTC m=+7381.957910012" watchObservedRunningTime="2026-02-19 23:31:10.776130591 +0000 UTC m=+7381.968648455" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.853385 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-tjvng"] Feb 19 23:31:10 crc kubenswrapper[4795]: E0219 23:31:10.853898 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fb1ef3-d414-4a7e-9db8-54edf1aad197" containerName="reboot-os-openstack-openstack-cell1" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.853921 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fb1ef3-d414-4a7e-9db8-54edf1aad197" containerName="reboot-os-openstack-openstack-cell1" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.854179 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="99fb1ef3-d414-4a7e-9db8-54edf1aad197" containerName="reboot-os-openstack-openstack-cell1" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.855068 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.857062 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.857231 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.857548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.857707 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.888198 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-tjvng"] Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027390 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027627 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" 
(UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129915 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " 
pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129979 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.130027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.130091 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.138333 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.138590 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.139004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.141080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.144098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc 
kubenswrapper[4795]: I0219 23:31:11.152384 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.152731 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.153427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.157156 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.162048 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") pod 
\"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.162313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.163097 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.191119 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.716343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-tjvng"] Feb 19 23:31:11 crc kubenswrapper[4795]: W0219 23:31:11.732546 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc08b8d0_e577_4674_9ca5_b1a02818725c.slice/crio-3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9 WatchSource:0}: Error finding container 3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9: Status 404 returned error can't find the container with id 3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9 Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.760834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" event={"ID":"dc08b8d0-e577-4674-9ca5-b1a02818725c","Type":"ContainerStarted","Data":"3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9"} Feb 19 23:31:12 crc kubenswrapper[4795]: I0219 23:31:12.773593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" event={"ID":"dc08b8d0-e577-4674-9ca5-b1a02818725c","Type":"ContainerStarted","Data":"99751b63fc1cb5589f18af4e12bb716b9914210a385140ae6ce01e4b1797e3f7"} Feb 19 23:31:12 crc kubenswrapper[4795]: I0219 23:31:12.798715 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" podStartSLOduration=2.341933919 podStartE2EDuration="2.798695768s" podCreationTimestamp="2026-02-19 23:31:10 +0000 UTC" firstStartedPulling="2026-02-19 23:31:11.736492348 +0000 UTC m=+7382.929010212" lastFinishedPulling="2026-02-19 23:31:12.193254177 +0000 UTC m=+7383.385772061" observedRunningTime="2026-02-19 
23:31:12.792067062 +0000 UTC m=+7383.984584936" watchObservedRunningTime="2026-02-19 23:31:12.798695768 +0000 UTC m=+7383.991213632" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.111461 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.112055 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.159779 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.856641 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.898900 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:18 crc kubenswrapper[4795]: I0219 23:31:18.827805 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4jw6z" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="registry-server" containerID="cri-o://70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" gracePeriod=2 Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.353545 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.418119 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") pod \"30103546-d69d-4d13-a174-02fa1187e597\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.418459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") pod \"30103546-d69d-4d13-a174-02fa1187e597\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.418553 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") pod \"30103546-d69d-4d13-a174-02fa1187e597\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.427027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8" (OuterVolumeSpecName: "kube-api-access-xmnd8") pod "30103546-d69d-4d13-a174-02fa1187e597" (UID: "30103546-d69d-4d13-a174-02fa1187e597"). InnerVolumeSpecName "kube-api-access-xmnd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.431302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities" (OuterVolumeSpecName: "utilities") pod "30103546-d69d-4d13-a174-02fa1187e597" (UID: "30103546-d69d-4d13-a174-02fa1187e597"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.490408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30103546-d69d-4d13-a174-02fa1187e597" (UID: "30103546-d69d-4d13-a174-02fa1187e597"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.524693 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.525076 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.525093 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840676 4795 generic.go:334] "Generic (PLEG): container finished" podID="30103546-d69d-4d13-a174-02fa1187e597" containerID="70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" exitCode=0 Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerDied","Data":"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e"} Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840756 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerDied","Data":"28b22d09111ffcca0542200937ddfd1229cba19345a04a67d12fa0c5bb260243"} Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840773 4795 scope.go:117] "RemoveContainer" containerID="70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840810 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.876625 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.886915 4795 scope.go:117] "RemoveContainer" containerID="0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.897873 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.921199 4795 scope.go:117] "RemoveContainer" containerID="6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.955828 4795 scope.go:117] "RemoveContainer" containerID="70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" Feb 19 23:31:19 crc kubenswrapper[4795]: E0219 23:31:19.956334 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e\": container with ID starting with 70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e not found: ID does not exist" containerID="70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 
23:31:19.956455 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e"} err="failed to get container status \"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e\": rpc error: code = NotFound desc = could not find container \"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e\": container with ID starting with 70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e not found: ID does not exist" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.956482 4795 scope.go:117] "RemoveContainer" containerID="0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217" Feb 19 23:31:19 crc kubenswrapper[4795]: E0219 23:31:19.957382 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217\": container with ID starting with 0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217 not found: ID does not exist" containerID="0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.957436 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217"} err="failed to get container status \"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217\": rpc error: code = NotFound desc = could not find container \"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217\": container with ID starting with 0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217 not found: ID does not exist" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.957456 4795 scope.go:117] "RemoveContainer" containerID="6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7" Feb 19 23:31:19 crc 
kubenswrapper[4795]: E0219 23:31:19.957768 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7\": container with ID starting with 6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7 not found: ID does not exist" containerID="6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.957807 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7"} err="failed to get container status \"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7\": rpc error: code = NotFound desc = could not find container \"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7\": container with ID starting with 6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7 not found: ID does not exist" Feb 19 23:31:21 crc kubenswrapper[4795]: I0219 23:31:21.525440 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30103546-d69d-4d13-a174-02fa1187e597" path="/var/lib/kubelet/pods/30103546-d69d-4d13-a174-02fa1187e597/volumes" Feb 19 23:31:31 crc kubenswrapper[4795]: I0219 23:31:31.979283 4795 generic.go:334] "Generic (PLEG): container finished" podID="dc08b8d0-e577-4674-9ca5-b1a02818725c" containerID="99751b63fc1cb5589f18af4e12bb716b9914210a385140ae6ce01e4b1797e3f7" exitCode=0 Feb 19 23:31:31 crc kubenswrapper[4795]: I0219 23:31:31.979390 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" event={"ID":"dc08b8d0-e577-4674-9ca5-b1a02818725c","Type":"ContainerDied","Data":"99751b63fc1cb5589f18af4e12bb716b9914210a385140ae6ce01e4b1797e3f7"} Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.473466 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.614809 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615083 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615185 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615262 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615298 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 
23:31:33.615347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615465 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615547 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615582 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615651 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.622414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.623070 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.623388 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.623958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm" (OuterVolumeSpecName: "kube-api-access-mqddm") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "kube-api-access-mqddm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.624691 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph" (OuterVolumeSpecName: "ceph") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.624724 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.625072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.626779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.626909 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.627531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.647110 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory" (OuterVolumeSpecName: "inventory") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.660460 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718904 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718941 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718952 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718963 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718974 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718984 4795 reconciler_common.go:293] "Volume 
detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718994 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719002 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719011 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719020 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719027 4795 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719035 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.000803 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" event={"ID":"dc08b8d0-e577-4674-9ca5-b1a02818725c","Type":"ContainerDied","Data":"3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9"} Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.000874 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.000906 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.086875 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-zjjhg"] Feb 19 23:31:34 crc kubenswrapper[4795]: E0219 23:31:34.087309 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="registry-server" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087324 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="registry-server" Feb 19 23:31:34 crc kubenswrapper[4795]: E0219 23:31:34.087334 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc08b8d0-e577-4674-9ca5-b1a02818725c" containerName="install-certs-openstack-openstack-cell1" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087341 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc08b8d0-e577-4674-9ca5-b1a02818725c" containerName="install-certs-openstack-openstack-cell1" Feb 19 23:31:34 crc kubenswrapper[4795]: E0219 23:31:34.087361 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="extract-utilities" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087367 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="extract-utilities" Feb 19 23:31:34 crc kubenswrapper[4795]: E0219 23:31:34.087381 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="extract-content" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087386 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="extract-content" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087578 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="registry-server" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087603 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc08b8d0-e577-4674-9ca5-b1a02818725c" containerName="install-certs-openstack-openstack-cell1" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.088498 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.093024 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.093240 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.093311 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.093350 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.107127 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-zjjhg"] Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.228113 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.228221 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.228320 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.228379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.330730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.330829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.330902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.331027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.334292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.334817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.336014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.347019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: 
\"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.406970 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:35 crc kubenswrapper[4795]: I0219 23:31:34.965145 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-zjjhg"] Feb 19 23:31:35 crc kubenswrapper[4795]: I0219 23:31:35.013109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" event={"ID":"532484aa-8294-4c2d-b257-082b09bafb14","Type":"ContainerStarted","Data":"d2a5200be3bac39a3f54995b67133a5b52bf7a3d5949fc27caceb27198cda798"} Feb 19 23:31:36 crc kubenswrapper[4795]: I0219 23:31:36.024419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" event={"ID":"532484aa-8294-4c2d-b257-082b09bafb14","Type":"ContainerStarted","Data":"33dcf59ba0f29e3e31575d7b58b985e030ba3d84d5bfa34eba4562c7de2dda67"} Feb 19 23:31:36 crc kubenswrapper[4795]: I0219 23:31:36.059902 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" podStartSLOduration=1.625393275 podStartE2EDuration="2.059881007s" podCreationTimestamp="2026-02-19 23:31:34 +0000 UTC" firstStartedPulling="2026-02-19 23:31:34.973857246 +0000 UTC m=+7406.166375120" lastFinishedPulling="2026-02-19 23:31:35.408344988 +0000 UTC m=+7406.600862852" observedRunningTime="2026-02-19 23:31:36.051342236 +0000 UTC m=+7407.243860100" watchObservedRunningTime="2026-02-19 23:31:36.059881007 +0000 UTC m=+7407.252398861" Feb 19 23:31:41 crc kubenswrapper[4795]: I0219 23:31:41.081139 4795 generic.go:334] "Generic (PLEG): container finished" podID="532484aa-8294-4c2d-b257-082b09bafb14" 
containerID="33dcf59ba0f29e3e31575d7b58b985e030ba3d84d5bfa34eba4562c7de2dda67" exitCode=0 Feb 19 23:31:41 crc kubenswrapper[4795]: I0219 23:31:41.081202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" event={"ID":"532484aa-8294-4c2d-b257-082b09bafb14","Type":"ContainerDied","Data":"33dcf59ba0f29e3e31575d7b58b985e030ba3d84d5bfa34eba4562c7de2dda67"} Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.613898 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.717608 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") pod \"532484aa-8294-4c2d-b257-082b09bafb14\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.718095 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") pod \"532484aa-8294-4c2d-b257-082b09bafb14\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.718216 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") pod \"532484aa-8294-4c2d-b257-082b09bafb14\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.718246 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") pod \"532484aa-8294-4c2d-b257-082b09bafb14\" (UID: 
\"532484aa-8294-4c2d-b257-082b09bafb14\") " Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.723837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph" (OuterVolumeSpecName: "ceph") pod "532484aa-8294-4c2d-b257-082b09bafb14" (UID: "532484aa-8294-4c2d-b257-082b09bafb14"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.725873 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw" (OuterVolumeSpecName: "kube-api-access-j6zjw") pod "532484aa-8294-4c2d-b257-082b09bafb14" (UID: "532484aa-8294-4c2d-b257-082b09bafb14"). InnerVolumeSpecName "kube-api-access-j6zjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.751835 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "532484aa-8294-4c2d-b257-082b09bafb14" (UID: "532484aa-8294-4c2d-b257-082b09bafb14"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.780801 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory" (OuterVolumeSpecName: "inventory") pod "532484aa-8294-4c2d-b257-082b09bafb14" (UID: "532484aa-8294-4c2d-b257-082b09bafb14"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.820371 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.820400 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.820410 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.820420 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.101707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" event={"ID":"532484aa-8294-4c2d-b257-082b09bafb14","Type":"ContainerDied","Data":"d2a5200be3bac39a3f54995b67133a5b52bf7a3d5949fc27caceb27198cda798"} Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.101749 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2a5200be3bac39a3f54995b67133a5b52bf7a3d5949fc27caceb27198cda798" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.101752 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.223292 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-b9rq5"] Feb 19 23:31:43 crc kubenswrapper[4795]: E0219 23:31:43.224950 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532484aa-8294-4c2d-b257-082b09bafb14" containerName="ceph-client-openstack-openstack-cell1" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.224987 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="532484aa-8294-4c2d-b257-082b09bafb14" containerName="ceph-client-openstack-openstack-cell1" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.226439 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="532484aa-8294-4c2d-b257-082b09bafb14" containerName="ceph-client-openstack-openstack-cell1" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.228078 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.231078 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.231110 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.231598 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.231608 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.233387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.245588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-b9rq5"] Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.332963 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333076 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 
23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333199 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333580 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435654 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: 
\"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435803 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") 
pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.437079 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.439804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.440276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.442116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.444734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") pod 
\"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.452940 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.551852 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:44 crc kubenswrapper[4795]: I0219 23:31:44.136892 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-b9rq5"] Feb 19 23:31:45 crc kubenswrapper[4795]: I0219 23:31:45.123515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" event={"ID":"d02efd94-2196-48fe-85d5-e2c65d186d6e","Type":"ContainerStarted","Data":"e4c88993e055ef4e5ec56d3915d302612d8dc348bfd21bf85166259a8eaaba9e"} Feb 19 23:31:45 crc kubenswrapper[4795]: I0219 23:31:45.123880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" event={"ID":"d02efd94-2196-48fe-85d5-e2c65d186d6e","Type":"ContainerStarted","Data":"6ea93587dfbb9217fefa8b1e6c8ed4ec98f16d4ee8f50c309e9ae240e35f3aa9"} Feb 19 23:31:45 crc kubenswrapper[4795]: I0219 23:31:45.148464 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" podStartSLOduration=1.6820629459999998 podStartE2EDuration="2.148446117s" podCreationTimestamp="2026-02-19 23:31:43 +0000 UTC" firstStartedPulling="2026-02-19 23:31:44.14700533 +0000 UTC m=+7415.339523194" lastFinishedPulling="2026-02-19 
23:31:44.613388501 +0000 UTC m=+7415.805906365" observedRunningTime="2026-02-19 23:31:45.145469443 +0000 UTC m=+7416.337987307" watchObservedRunningTime="2026-02-19 23:31:45.148446117 +0000 UTC m=+7416.340963981" Feb 19 23:32:28 crc kubenswrapper[4795]: I0219 23:32:28.427672 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:32:28 crc kubenswrapper[4795]: I0219 23:32:28.428427 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:32:48 crc kubenswrapper[4795]: I0219 23:32:48.780833 4795 generic.go:334] "Generic (PLEG): container finished" podID="d02efd94-2196-48fe-85d5-e2c65d186d6e" containerID="e4c88993e055ef4e5ec56d3915d302612d8dc348bfd21bf85166259a8eaaba9e" exitCode=0 Feb 19 23:32:48 crc kubenswrapper[4795]: I0219 23:32:48.780925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" event={"ID":"d02efd94-2196-48fe-85d5-e2c65d186d6e","Type":"ContainerDied","Data":"e4c88993e055ef4e5ec56d3915d302612d8dc348bfd21bf85166259a8eaaba9e"} Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.279049 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403304 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403519 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403562 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403784 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.409263 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.413519 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj" (OuterVolumeSpecName: "kube-api-access-8bvsj") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "kube-api-access-8bvsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.413515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph" (OuterVolumeSpecName: "ceph") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.430682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.438661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory" (OuterVolumeSpecName: "inventory") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.439193 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506535 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506573 4795 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506583 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506593 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506602 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506610 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.807604 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" event={"ID":"d02efd94-2196-48fe-85d5-e2c65d186d6e","Type":"ContainerDied","Data":"6ea93587dfbb9217fefa8b1e6c8ed4ec98f16d4ee8f50c309e9ae240e35f3aa9"} Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.807684 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ea93587dfbb9217fefa8b1e6c8ed4ec98f16d4ee8f50c309e9ae240e35f3aa9" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.807687 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.902501 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5855g"] Feb 19 23:32:50 crc kubenswrapper[4795]: E0219 23:32:50.902992 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02efd94-2196-48fe-85d5-e2c65d186d6e" containerName="ovn-openstack-openstack-cell1" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.903006 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02efd94-2196-48fe-85d5-e2c65d186d6e" containerName="ovn-openstack-openstack-cell1" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.903251 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02efd94-2196-48fe-85d5-e2c65d186d6e" containerName="ovn-openstack-openstack-cell1" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.904091 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.906083 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.906129 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.906156 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.906995 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.908867 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.909570 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.917505 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5855g"] Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.016453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017813 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017871 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") pod 
\"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120700 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120764 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc 
kubenswrapper[4795]: I0219 23:32:51.120794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120989 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.126073 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.126272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.126281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.127809 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.128902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.131512 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.140052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.235222 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.780084 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5855g"] Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.818661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" event={"ID":"a23f1a80-1645-454d-b9cf-e039928b84cb","Type":"ContainerStarted","Data":"ac532540ed1c9ed8ee4bab62b2b0f16733f15b6c586c9842153ae0b7941ac4df"} Feb 19 23:32:52 crc kubenswrapper[4795]: I0219 23:32:52.843844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" event={"ID":"a23f1a80-1645-454d-b9cf-e039928b84cb","Type":"ContainerStarted","Data":"2e3e3e59d5c650a24ac4dc17cdcdbd825d499979952a0af456bf13fbe15d8292"} Feb 19 23:32:52 crc kubenswrapper[4795]: I0219 23:32:52.870227 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" podStartSLOduration=2.442282574 podStartE2EDuration="2.87021085s" podCreationTimestamp="2026-02-19 23:32:50 +0000 UTC" firstStartedPulling="2026-02-19 23:32:51.788950332 +0000 UTC m=+7482.981468196" lastFinishedPulling="2026-02-19 23:32:52.216878608 +0000 UTC m=+7483.409396472" observedRunningTime="2026-02-19 23:32:52.869766637 +0000 UTC m=+7484.062284511" watchObservedRunningTime="2026-02-19 23:32:52.87021085 +0000 UTC m=+7484.062728714" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.427922 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 
23:32:58.428906 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.580533 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.584217 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.603469 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.698726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.698816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.698930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") pod 
\"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.801011 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.801488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.801566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.801669 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.802017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") pod \"certified-operators-nqrh8\" (UID: 
\"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.823032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.917427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:59 crc kubenswrapper[4795]: I0219 23:32:59.488494 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:32:59 crc kubenswrapper[4795]: I0219 23:32:59.914622 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerID="40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91" exitCode=0 Feb 19 23:32:59 crc kubenswrapper[4795]: I0219 23:32:59.914731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerDied","Data":"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91"} Feb 19 23:32:59 crc kubenswrapper[4795]: I0219 23:32:59.914951 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerStarted","Data":"90136202358e702fa7897a42d741cd8a62535cb68b9b9944a6dabad76cb7864e"} Feb 19 23:33:00 crc kubenswrapper[4795]: I0219 23:33:00.934785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" 
event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerStarted","Data":"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93"} Feb 19 23:33:01 crc kubenswrapper[4795]: I0219 23:33:01.946965 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerID="456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93" exitCode=0 Feb 19 23:33:01 crc kubenswrapper[4795]: I0219 23:33:01.947068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerDied","Data":"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93"} Feb 19 23:33:02 crc kubenswrapper[4795]: I0219 23:33:02.957898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerStarted","Data":"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935"} Feb 19 23:33:02 crc kubenswrapper[4795]: I0219 23:33:02.984624 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nqrh8" podStartSLOduration=2.537221318 podStartE2EDuration="4.984606414s" podCreationTimestamp="2026-02-19 23:32:58 +0000 UTC" firstStartedPulling="2026-02-19 23:32:59.917032981 +0000 UTC m=+7491.109550845" lastFinishedPulling="2026-02-19 23:33:02.364418077 +0000 UTC m=+7493.556935941" observedRunningTime="2026-02-19 23:33:02.976951588 +0000 UTC m=+7494.169469462" watchObservedRunningTime="2026-02-19 23:33:02.984606414 +0000 UTC m=+7494.177124278" Feb 19 23:33:08 crc kubenswrapper[4795]: I0219 23:33:08.918870 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:08 crc kubenswrapper[4795]: I0219 23:33:08.919682 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:08 crc kubenswrapper[4795]: I0219 23:33:08.987400 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:09 crc kubenswrapper[4795]: I0219 23:33:09.073416 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:09 crc kubenswrapper[4795]: I0219 23:33:09.225219 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.042496 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nqrh8" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="registry-server" containerID="cri-o://b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" gracePeriod=2 Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.590182 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.695448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") pod \"1b009444-b438-433d-8e2c-abc763e6f9ee\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.695515 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") pod \"1b009444-b438-433d-8e2c-abc763e6f9ee\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.695551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") pod \"1b009444-b438-433d-8e2c-abc763e6f9ee\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.697304 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities" (OuterVolumeSpecName: "utilities") pod "1b009444-b438-433d-8e2c-abc763e6f9ee" (UID: "1b009444-b438-433d-8e2c-abc763e6f9ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.703181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h" (OuterVolumeSpecName: "kube-api-access-g9w6h") pod "1b009444-b438-433d-8e2c-abc763e6f9ee" (UID: "1b009444-b438-433d-8e2c-abc763e6f9ee"). InnerVolumeSpecName "kube-api-access-g9w6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.745139 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b009444-b438-433d-8e2c-abc763e6f9ee" (UID: "1b009444-b438-433d-8e2c-abc763e6f9ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.798663 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.798702 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.798719 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.054839 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerID="b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" exitCode=0 Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.054890 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.054886 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerDied","Data":"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935"} Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.055314 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerDied","Data":"90136202358e702fa7897a42d741cd8a62535cb68b9b9944a6dabad76cb7864e"} Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.055339 4795 scope.go:117] "RemoveContainer" containerID="b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.076286 4795 scope.go:117] "RemoveContainer" containerID="456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.096891 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.105625 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.126629 4795 scope.go:117] "RemoveContainer" containerID="40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.153359 4795 scope.go:117] "RemoveContainer" containerID="b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" Feb 19 23:33:12 crc kubenswrapper[4795]: E0219 23:33:12.153806 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935\": container with ID starting with b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935 not found: ID does not exist" containerID="b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.153864 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935"} err="failed to get container status \"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935\": rpc error: code = NotFound desc = could not find container \"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935\": container with ID starting with b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935 not found: ID does not exist" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.153886 4795 scope.go:117] "RemoveContainer" containerID="456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93" Feb 19 23:33:12 crc kubenswrapper[4795]: E0219 23:33:12.154204 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93\": container with ID starting with 456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93 not found: ID does not exist" containerID="456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.154222 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93"} err="failed to get container status \"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93\": rpc error: code = NotFound desc = could not find container \"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93\": container with ID 
starting with 456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93 not found: ID does not exist" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.154236 4795 scope.go:117] "RemoveContainer" containerID="40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91" Feb 19 23:33:12 crc kubenswrapper[4795]: E0219 23:33:12.154645 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91\": container with ID starting with 40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91 not found: ID does not exist" containerID="40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.154669 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91"} err="failed to get container status \"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91\": rpc error: code = NotFound desc = could not find container \"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91\": container with ID starting with 40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91 not found: ID does not exist" Feb 19 23:33:13 crc kubenswrapper[4795]: I0219 23:33:13.525727 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" path="/var/lib/kubelet/pods/1b009444-b438-433d-8e2c-abc763e6f9ee/volumes" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.032614 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:16 crc kubenswrapper[4795]: E0219 23:33:16.033903 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="extract-utilities" Feb 19 23:33:16 crc 
kubenswrapper[4795]: I0219 23:33:16.033918 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="extract-utilities" Feb 19 23:33:16 crc kubenswrapper[4795]: E0219 23:33:16.033945 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="registry-server" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.033951 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="registry-server" Feb 19 23:33:16 crc kubenswrapper[4795]: E0219 23:33:16.033971 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="extract-content" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.033977 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="extract-content" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.034238 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="registry-server" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.035707 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.053421 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.111116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.111334 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.111422 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.213722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.213887 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.213985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.214527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.214865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.243561 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.369917 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.890972 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:17 crc kubenswrapper[4795]: I0219 23:33:17.108273 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerStarted","Data":"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579"} Feb 19 23:33:17 crc kubenswrapper[4795]: I0219 23:33:17.108331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerStarted","Data":"c64e95032c64e4397cad5beadc6695dedca6a054e73613870cbcf23737ede0cd"} Feb 19 23:33:18 crc kubenswrapper[4795]: I0219 23:33:18.118307 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a0d5fa-fc7f-4107-807b-633943866132" containerID="cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579" exitCode=0 Feb 19 23:33:18 crc kubenswrapper[4795]: I0219 23:33:18.118375 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerDied","Data":"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579"} Feb 19 23:33:20 crc kubenswrapper[4795]: I0219 23:33:20.142319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerStarted","Data":"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671"} Feb 19 23:33:24 crc kubenswrapper[4795]: I0219 23:33:24.186065 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a0d5fa-fc7f-4107-807b-633943866132" 
containerID="8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671" exitCode=0 Feb 19 23:33:24 crc kubenswrapper[4795]: I0219 23:33:24.186139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerDied","Data":"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671"} Feb 19 23:33:25 crc kubenswrapper[4795]: I0219 23:33:25.198995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerStarted","Data":"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b"} Feb 19 23:33:25 crc kubenswrapper[4795]: I0219 23:33:25.238024 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f54zd" podStartSLOduration=2.7452607479999998 podStartE2EDuration="9.238004116s" podCreationTimestamp="2026-02-19 23:33:16 +0000 UTC" firstStartedPulling="2026-02-19 23:33:18.120817122 +0000 UTC m=+7509.313334976" lastFinishedPulling="2026-02-19 23:33:24.61356044 +0000 UTC m=+7515.806078344" observedRunningTime="2026-02-19 23:33:25.222687054 +0000 UTC m=+7516.415204948" watchObservedRunningTime="2026-02-19 23:33:25.238004116 +0000 UTC m=+7516.430522000" Feb 19 23:33:26 crc kubenswrapper[4795]: I0219 23:33:26.370034 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:26 crc kubenswrapper[4795]: I0219 23:33:26.370795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:27 crc kubenswrapper[4795]: I0219 23:33:27.423542 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f54zd" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" 
probeResult="failure" output=< Feb 19 23:33:27 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:33:27 crc kubenswrapper[4795]: > Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.427658 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.427739 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.427803 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.428721 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.428827 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190" gracePeriod=600 Feb 19 23:33:29 crc kubenswrapper[4795]: I0219 
23:33:29.242988 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190" exitCode=0 Feb 19 23:33:29 crc kubenswrapper[4795]: I0219 23:33:29.243058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190"} Feb 19 23:33:29 crc kubenswrapper[4795]: I0219 23:33:29.243635 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"} Feb 19 23:33:29 crc kubenswrapper[4795]: I0219 23:33:29.243654 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:33:36 crc kubenswrapper[4795]: I0219 23:33:36.449391 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:36 crc kubenswrapper[4795]: I0219 23:33:36.502665 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:36 crc kubenswrapper[4795]: I0219 23:33:36.808281 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:38 crc kubenswrapper[4795]: I0219 23:33:38.342033 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f54zd" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" containerID="cri-o://1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" gracePeriod=2 Feb 19 23:33:38 crc 
kubenswrapper[4795]: I0219 23:33:38.906389 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.070712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") pod \"32a0d5fa-fc7f-4107-807b-633943866132\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.070771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") pod \"32a0d5fa-fc7f-4107-807b-633943866132\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.070863 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") pod \"32a0d5fa-fc7f-4107-807b-633943866132\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.071626 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities" (OuterVolumeSpecName: "utilities") pod "32a0d5fa-fc7f-4107-807b-633943866132" (UID: "32a0d5fa-fc7f-4107-807b-633943866132"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.077063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff" (OuterVolumeSpecName: "kube-api-access-l2wff") pod "32a0d5fa-fc7f-4107-807b-633943866132" (UID: "32a0d5fa-fc7f-4107-807b-633943866132"). InnerVolumeSpecName "kube-api-access-l2wff". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.173529 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.173565 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.217632 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32a0d5fa-fc7f-4107-807b-633943866132" (UID: "32a0d5fa-fc7f-4107-807b-633943866132"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.275374 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355387 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a0d5fa-fc7f-4107-807b-633943866132" containerID="1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" exitCode=0 Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerDied","Data":"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b"} Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355439 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355468 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerDied","Data":"c64e95032c64e4397cad5beadc6695dedca6a054e73613870cbcf23737ede0cd"} Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355491 4795 scope.go:117] "RemoveContainer" containerID="1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.379320 4795 scope.go:117] "RemoveContainer" containerID="8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.399893 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.408376 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.430149 4795 scope.go:117] "RemoveContainer" containerID="cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.472153 4795 scope.go:117] "RemoveContainer" containerID="1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" Feb 19 23:33:39 crc kubenswrapper[4795]: E0219 23:33:39.473021 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b\": container with ID starting with 1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b not found: ID does not exist" containerID="1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473081 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b"} err="failed to get container status \"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b\": rpc error: code = NotFound desc = could not find container \"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b\": container with ID starting with 1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b not found: ID does not exist" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473116 4795 scope.go:117] "RemoveContainer" containerID="8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671" Feb 19 23:33:39 crc kubenswrapper[4795]: E0219 23:33:39.473441 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671\": container with ID starting with 8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671 not found: ID does not exist" containerID="8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473474 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671"} err="failed to get container status \"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671\": rpc error: code = NotFound desc = could not find container \"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671\": container with ID starting with 8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671 not found: ID does not exist" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473494 4795 scope.go:117] "RemoveContainer" containerID="cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579" Feb 19 23:33:39 crc kubenswrapper[4795]: E0219 
23:33:39.473725 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579\": container with ID starting with cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579 not found: ID does not exist" containerID="cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473766 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579"} err="failed to get container status \"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579\": rpc error: code = NotFound desc = could not find container \"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579\": container with ID starting with cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579 not found: ID does not exist" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.527824 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a0d5fa-fc7f-4107-807b-633943866132" path="/var/lib/kubelet/pods/32a0d5fa-fc7f-4107-807b-633943866132/volumes" Feb 19 23:33:43 crc kubenswrapper[4795]: I0219 23:33:43.398472 4795 generic.go:334] "Generic (PLEG): container finished" podID="a23f1a80-1645-454d-b9cf-e039928b84cb" containerID="2e3e3e59d5c650a24ac4dc17cdcdbd825d499979952a0af456bf13fbe15d8292" exitCode=0 Feb 19 23:33:43 crc kubenswrapper[4795]: I0219 23:33:43.398554 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" event={"ID":"a23f1a80-1645-454d-b9cf-e039928b84cb","Type":"ContainerDied","Data":"2e3e3e59d5c650a24ac4dc17cdcdbd825d499979952a0af456bf13fbe15d8292"} Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.884653 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995501 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995573 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995625 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995702 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995719 
4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.001913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm" (OuterVolumeSpecName: "kube-api-access-rphwm") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "kube-api-access-rphwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.003452 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.020429 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph" (OuterVolumeSpecName: "ceph") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.029144 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.032823 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.035229 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory" (OuterVolumeSpecName: "inventory") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.050990 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098842 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098878 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098889 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098899 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098909 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098920 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098929 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.419031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" event={"ID":"a23f1a80-1645-454d-b9cf-e039928b84cb","Type":"ContainerDied","Data":"ac532540ed1c9ed8ee4bab62b2b0f16733f15b6c586c9842153ae0b7941ac4df"} Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.419085 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac532540ed1c9ed8ee4bab62b2b0f16733f15b6c586c9842153ae0b7941ac4df" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.419099 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.540604 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-drll7"] Feb 19 23:33:45 crc kubenswrapper[4795]: E0219 23:33:45.541626 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.541660 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" Feb 19 23:33:45 crc kubenswrapper[4795]: E0219 23:33:45.541700 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="extract-utilities" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.541719 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="extract-utilities" Feb 19 23:33:45 crc kubenswrapper[4795]: E0219 23:33:45.541802 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="extract-content" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.541821 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="extract-content" Feb 19 23:33:45 crc kubenswrapper[4795]: E0219 23:33:45.541860 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23f1a80-1645-454d-b9cf-e039928b84cb" containerName="neutron-metadata-openstack-openstack-cell1" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.541881 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23f1a80-1645-454d-b9cf-e039928b84cb" containerName="neutron-metadata-openstack-openstack-cell1" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.545784 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.545890 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23f1a80-1645-454d-b9cf-e039928b84cb" containerName="neutron-metadata-openstack-openstack-cell1" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.547617 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-drll7"] Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.547760 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.558850 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.559012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.559148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.559252 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.559393 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.717099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 
23:33:45.819541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " 
pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.825713 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.825824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.827134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.828448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.830016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: 
\"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.843847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.875823 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:46 crc kubenswrapper[4795]: I0219 23:33:46.470454 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-drll7"] Feb 19 23:33:47 crc kubenswrapper[4795]: I0219 23:33:47.445060 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-drll7" event={"ID":"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d","Type":"ContainerStarted","Data":"3fb8f68f6db9466d9e5013e9df140faafbadc024d0aebac89bdac3d89056c34b"} Feb 19 23:33:47 crc kubenswrapper[4795]: I0219 23:33:47.445876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-drll7" event={"ID":"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d","Type":"ContainerStarted","Data":"340270680cada990d6856771e8b48e114d5f75d179a6b65f804e0367442e7b18"} Feb 19 23:33:47 crc kubenswrapper[4795]: I0219 23:33:47.467729 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-drll7" podStartSLOduration=2.031218623 podStartE2EDuration="2.467708361s" podCreationTimestamp="2026-02-19 23:33:45 +0000 UTC" firstStartedPulling="2026-02-19 23:33:46.481866994 +0000 UTC m=+7537.674384858" lastFinishedPulling="2026-02-19 23:33:46.918356732 +0000 UTC 
m=+7538.110874596" observedRunningTime="2026-02-19 23:33:47.465338784 +0000 UTC m=+7538.657856688" watchObservedRunningTime="2026-02-19 23:33:47.467708361 +0000 UTC m=+7538.660226225" Feb 19 23:35:28 crc kubenswrapper[4795]: I0219 23:35:28.427648 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:35:28 crc kubenswrapper[4795]: I0219 23:35:28.428134 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.086099 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"] Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.091700 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.120250 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"] Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.262040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.262257 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.262372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.364776 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.364832 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.364891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.365514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.365770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.395735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.422869 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.902302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"] Feb 19 23:35:54 crc kubenswrapper[4795]: I0219 23:35:54.777498 4795 generic.go:334] "Generic (PLEG): container finished" podID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerID="075ff9636f71ea6b36cce9312e631fc0c64c7440971ccbcdb446ae2d249ac68d" exitCode=0 Feb 19 23:35:54 crc kubenswrapper[4795]: I0219 23:35:54.777584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerDied","Data":"075ff9636f71ea6b36cce9312e631fc0c64c7440971ccbcdb446ae2d249ac68d"} Feb 19 23:35:54 crc kubenswrapper[4795]: I0219 23:35:54.777835 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerStarted","Data":"c8819ed89cfef249f5f7927f5e7d5d89c50e025a9ea87be413c8b7593d29b7f2"} Feb 19 23:35:54 crc kubenswrapper[4795]: I0219 23:35:54.780185 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:35:55 crc kubenswrapper[4795]: I0219 23:35:55.788948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerStarted","Data":"bc4e8940e2128aa71b442ef79e89ec98fa8c8ba519f41bdd1d4f14b52325a51b"} Feb 19 23:35:56 crc kubenswrapper[4795]: I0219 23:35:56.806337 4795 generic.go:334] "Generic (PLEG): container finished" podID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerID="bc4e8940e2128aa71b442ef79e89ec98fa8c8ba519f41bdd1d4f14b52325a51b" exitCode=0 Feb 19 23:35:56 crc kubenswrapper[4795]: I0219 23:35:56.806396 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerDied","Data":"bc4e8940e2128aa71b442ef79e89ec98fa8c8ba519f41bdd1d4f14b52325a51b"} Feb 19 23:35:57 crc kubenswrapper[4795]: I0219 23:35:57.820399 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerStarted","Data":"f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a"} Feb 19 23:35:57 crc kubenswrapper[4795]: I0219 23:35:57.843827 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hcqms" podStartSLOduration=2.411270578 podStartE2EDuration="4.843807916s" podCreationTimestamp="2026-02-19 23:35:53 +0000 UTC" firstStartedPulling="2026-02-19 23:35:54.779891316 +0000 UTC m=+7665.972409180" lastFinishedPulling="2026-02-19 23:35:57.212428644 +0000 UTC m=+7668.404946518" observedRunningTime="2026-02-19 23:35:57.836563601 +0000 UTC m=+7669.029081515" watchObservedRunningTime="2026-02-19 23:35:57.843807916 +0000 UTC m=+7669.036325770" Feb 19 23:35:58 crc kubenswrapper[4795]: I0219 23:35:58.427298 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:35:58 crc kubenswrapper[4795]: I0219 23:35:58.427352 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.424018 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.424611 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.482059 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.919157 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.988448 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"] Feb 19 23:36:05 crc kubenswrapper[4795]: I0219 23:36:05.892876 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hcqms" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="registry-server" containerID="cri-o://f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a" gracePeriod=2 Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.908606 4795 generic.go:334] "Generic (PLEG): container finished" podID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerID="f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a" exitCode=0 Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.908750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerDied","Data":"f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a"} Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.909140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" 
event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerDied","Data":"c8819ed89cfef249f5f7927f5e7d5d89c50e025a9ea87be413c8b7593d29b7f2"} Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.909157 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8819ed89cfef249f5f7927f5e7d5d89c50e025a9ea87be413c8b7593d29b7f2" Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.926748 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.064311 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") pod \"92699807-f31f-4ef1-80a3-c85b5ae52267\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.064423 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") pod \"92699807-f31f-4ef1-80a3-c85b5ae52267\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.064726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") pod \"92699807-f31f-4ef1-80a3-c85b5ae52267\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.066462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities" (OuterVolumeSpecName: "utilities") pod "92699807-f31f-4ef1-80a3-c85b5ae52267" (UID: "92699807-f31f-4ef1-80a3-c85b5ae52267"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.076517 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4" (OuterVolumeSpecName: "kube-api-access-s76d4") pod "92699807-f31f-4ef1-80a3-c85b5ae52267" (UID: "92699807-f31f-4ef1-80a3-c85b5ae52267"). InnerVolumeSpecName "kube-api-access-s76d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.086938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92699807-f31f-4ef1-80a3-c85b5ae52267" (UID: "92699807-f31f-4ef1-80a3-c85b5ae52267"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.167457 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.167485 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.167495 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.923212 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcqms" Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.962131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"] Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.980428 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"] Feb 19 23:36:09 crc kubenswrapper[4795]: I0219 23:36:09.537388 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" path="/var/lib/kubelet/pods/92699807-f31f-4ef1-80a3-c85b5ae52267/volumes" Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.427028 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.427598 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.427643 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.428500 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.428564 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" gracePeriod=600 Feb 19 23:36:28 crc kubenswrapper[4795]: E0219 23:36:28.552588 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.736676 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" exitCode=0 Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.736734 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"} Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.738131 4795 scope.go:117] "RemoveContainer" containerID="53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190" Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.738885 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:36:28 crc kubenswrapper[4795]: E0219 23:36:28.739201 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:36:41 crc kubenswrapper[4795]: I0219 23:36:41.511553 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:36:41 crc kubenswrapper[4795]: E0219 23:36:41.512658 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:36:56 crc kubenswrapper[4795]: I0219 23:36:56.512393 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:36:56 crc kubenswrapper[4795]: E0219 23:36:56.514507 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:37:09 crc kubenswrapper[4795]: I0219 23:37:09.520733 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:37:09 crc kubenswrapper[4795]: E0219 23:37:09.521956 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:37:20 crc kubenswrapper[4795]: I0219 23:37:20.512473 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:37:20 crc kubenswrapper[4795]: E0219 23:37:20.513400 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:37:32 crc kubenswrapper[4795]: I0219 23:37:32.513463 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:37:32 crc kubenswrapper[4795]: E0219 23:37:32.514258 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:37:43 crc kubenswrapper[4795]: I0219 23:37:43.512389 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:37:43 crc kubenswrapper[4795]: E0219 
23:37:43.513543 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:37:56 crc kubenswrapper[4795]: I0219 23:37:56.512492 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:37:56 crc kubenswrapper[4795]: E0219 23:37:56.513781 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:38:10 crc kubenswrapper[4795]: I0219 23:38:10.512594 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:38:10 crc kubenswrapper[4795]: E0219 23:38:10.513570 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:38:13 crc kubenswrapper[4795]: I0219 23:38:13.815435 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" 
containerID="3fb8f68f6db9466d9e5013e9df140faafbadc024d0aebac89bdac3d89056c34b" exitCode=0 Feb 19 23:38:13 crc kubenswrapper[4795]: I0219 23:38:13.815595 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-drll7" event={"ID":"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d","Type":"ContainerDied","Data":"3fb8f68f6db9466d9e5013e9df140faafbadc024d0aebac89bdac3d89056c34b"} Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.369901 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.534960 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535020 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535066 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535102 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: 
\"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535295 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.541380 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.541474 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf" (OuterVolumeSpecName: "kube-api-access-rd8nf") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "kube-api-access-rd8nf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.548472 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph" (OuterVolumeSpecName: "ceph") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.565425 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.575498 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory" (OuterVolumeSpecName: "inventory") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.578302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.640420 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.640872 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.641130 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.641355 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.641762 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.641900 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.840327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-drll7" event={"ID":"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d","Type":"ContainerDied","Data":"340270680cada990d6856771e8b48e114d5f75d179a6b65f804e0367442e7b18"} 
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.840406 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="340270680cada990d6856771e8b48e114d5f75d179a6b65f804e0367442e7b18" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.840471 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.948333 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-q26dr"] Feb 19 23:38:15 crc kubenswrapper[4795]: E0219 23:38:15.949105 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="extract-content" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949126 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="extract-content" Feb 19 23:38:15 crc kubenswrapper[4795]: E0219 23:38:15.949135 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="registry-server" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949214 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="registry-server" Feb 19 23:38:15 crc kubenswrapper[4795]: E0219 23:38:15.949252 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" containerName="libvirt-openstack-openstack-cell1" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949261 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" containerName="libvirt-openstack-openstack-cell1" Feb 19 23:38:15 crc kubenswrapper[4795]: E0219 23:38:15.949279 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" 
containerName="extract-utilities" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949285 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="extract-utilities" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949474 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="registry-server" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949488 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" containerName="libvirt-openstack-openstack-cell1" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.950245 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.956521 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.956735 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.956859 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.957011 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.957596 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.957866 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.958411 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.976202 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-q26dr"] Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: 
\"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051705 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051898 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.052020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.052208 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") pod 
\"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154402 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154481 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154579 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.156682 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.157003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161033 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161816 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.162893 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.163484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.166416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.166824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.167035 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.179610 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.283826 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.873229 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-q26dr"]
Feb 19 23:38:17 crc kubenswrapper[4795]: I0219 23:38:17.862325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" event={"ID":"df818d88-cec5-4daf-8b17-cc4bb298b498","Type":"ContainerStarted","Data":"c612bf4c4e008dbfc987ab33b3be7f695dac905bf903aefb2bc487385869b975"}
Feb 19 23:38:17 crc kubenswrapper[4795]: I0219 23:38:17.862772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" event={"ID":"df818d88-cec5-4daf-8b17-cc4bb298b498","Type":"ContainerStarted","Data":"3ddf7886211510da97446e708903944220215664f3eae31a24ed524c038decc3"}
Feb 19 23:38:17 crc kubenswrapper[4795]: I0219 23:38:17.881411 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" podStartSLOduration=2.376485352 podStartE2EDuration="2.881383398s" podCreationTimestamp="2026-02-19 23:38:15 +0000 UTC" firstStartedPulling="2026-02-19 23:38:16.882508804 +0000 UTC m=+7808.075026708" lastFinishedPulling="2026-02-19 23:38:17.38740689 +0000 UTC m=+7808.579924754" observedRunningTime="2026-02-19 23:38:17.879356581 +0000 UTC m=+7809.071874445" watchObservedRunningTime="2026-02-19 23:38:17.881383398 +0000 UTC m=+7809.073901282"
Feb 19 23:38:21 crc kubenswrapper[4795]: I0219 23:38:21.512136 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:38:21 crc kubenswrapper[4795]: E0219 23:38:21.513207 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:38:36 crc kubenswrapper[4795]: I0219 23:38:36.511894 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:38:36 crc kubenswrapper[4795]: E0219 23:38:36.513250 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:38:47 crc kubenswrapper[4795]: I0219 23:38:47.512215 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:38:47 crc kubenswrapper[4795]: E0219 23:38:47.514506 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:39:00 crc kubenswrapper[4795]: I0219 23:39:00.511912 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:39:00 crc kubenswrapper[4795]: E0219 23:39:00.513746 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:39:11 crc kubenswrapper[4795]: I0219 23:39:11.514702 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:39:11 crc kubenswrapper[4795]: E0219 23:39:11.515567 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:39:24 crc kubenswrapper[4795]: I0219 23:39:24.512692 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:39:24 crc kubenswrapper[4795]: E0219 23:39:24.513785 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:39:39 crc kubenswrapper[4795]: I0219 23:39:39.520606 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:39:39 crc kubenswrapper[4795]: E0219 23:39:39.521849 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:39:52 crc kubenswrapper[4795]: I0219 23:39:52.512202 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:39:52 crc kubenswrapper[4795]: E0219 23:39:52.513383 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:40:04 crc kubenswrapper[4795]: I0219 23:40:04.512360 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:40:04 crc kubenswrapper[4795]: E0219 23:40:04.513192 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:40:19 crc kubenswrapper[4795]: I0219 23:40:19.519003 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:40:19 crc kubenswrapper[4795]: E0219 23:40:19.519921 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:40:30 crc kubenswrapper[4795]: I0219 23:40:30.512668 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:40:30 crc kubenswrapper[4795]: E0219 23:40:30.513480 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:40:42 crc kubenswrapper[4795]: I0219 23:40:42.512079 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:40:42 crc kubenswrapper[4795]: E0219 23:40:42.512958 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:40:56 crc kubenswrapper[4795]: I0219 23:40:56.512338 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:40:56 crc kubenswrapper[4795]: E0219 23:40:56.513267 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:40:59 crc kubenswrapper[4795]: I0219 23:40:59.533879 4795 generic.go:334] "Generic (PLEG): container finished" podID="df818d88-cec5-4daf-8b17-cc4bb298b498" containerID="c612bf4c4e008dbfc987ab33b3be7f695dac905bf903aefb2bc487385869b975" exitCode=0
Feb 19 23:40:59 crc kubenswrapper[4795]: I0219 23:40:59.533923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" event={"ID":"df818d88-cec5-4daf-8b17-cc4bb298b498","Type":"ContainerDied","Data":"c612bf4c4e008dbfc987ab33b3be7f695dac905bf903aefb2bc487385869b975"}
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.020045 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098308 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098398 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098540 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098653 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098702 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") "
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.117245 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz" (OuterVolumeSpecName: "kube-api-access-t2sgz") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "kube-api-access-t2sgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.121398 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph" (OuterVolumeSpecName: "ceph") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.126308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.127625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.129375 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.132580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.133988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.139379 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.139699 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory" (OuterVolumeSpecName: "inventory") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.141303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.143018 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.150414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.152354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201726 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201760 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201785 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201796 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201805 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201815 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201851 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201860 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201869 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201879 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201888 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201896 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201926 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.554497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" event={"ID":"df818d88-cec5-4daf-8b17-cc4bb298b498","Type":"ContainerDied","Data":"3ddf7886211510da97446e708903944220215664f3eae31a24ed524c038decc3"}
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.554821 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ddf7886211510da97446e708903944220215664f3eae31a24ed524c038decc3"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.554559 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.650596 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-45fwk"]
Feb 19 23:41:01 crc kubenswrapper[4795]: E0219 23:41:01.650988 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df818d88-cec5-4daf-8b17-cc4bb298b498" containerName="nova-cell1-openstack-openstack-cell1"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.651005 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="df818d88-cec5-4daf-8b17-cc4bb298b498" containerName="nova-cell1-openstack-openstack-cell1"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.651239 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="df818d88-cec5-4daf-8b17-cc4bb298b498" containerName="nova-cell1-openstack-openstack-cell1"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.651894 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.654664 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.654950 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.655014 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.654954 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.655181 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.664013 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-45fwk"]
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.710868 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.710916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711231 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711286 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711340 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.813817 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.813910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.813988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814013 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814041 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814161 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.817477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.817539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.818729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.818805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 
23:41:01.818850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.819514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.822695 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.830768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.979948 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:02 crc kubenswrapper[4795]: I0219 23:41:02.615817 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:41:02 crc kubenswrapper[4795]: I0219 23:41:02.616130 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-45fwk"] Feb 19 23:41:03 crc kubenswrapper[4795]: I0219 23:41:03.582476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" event={"ID":"8272a408-0416-4077-9e85-b2962992b3f4","Type":"ContainerStarted","Data":"0b42e21c1af1d461acf5578be3104a023047049f42895449a7c537e578dfd456"} Feb 19 23:41:03 crc kubenswrapper[4795]: I0219 23:41:03.582982 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" event={"ID":"8272a408-0416-4077-9e85-b2962992b3f4","Type":"ContainerStarted","Data":"4bd117515658cc36bb2328744f0a39f6b5d75a26befda093e2b0ac74514155d0"} Feb 19 23:41:03 crc kubenswrapper[4795]: I0219 23:41:03.611047 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" podStartSLOduration=2.151522709 podStartE2EDuration="2.611027004s" podCreationTimestamp="2026-02-19 23:41:01 +0000 UTC" firstStartedPulling="2026-02-19 23:41:02.615533476 +0000 UTC m=+7973.808051360" lastFinishedPulling="2026-02-19 23:41:03.075037791 +0000 UTC m=+7974.267555655" observedRunningTime="2026-02-19 23:41:03.597381196 +0000 UTC m=+7974.789899060" watchObservedRunningTime="2026-02-19 23:41:03.611027004 +0000 UTC m=+7974.803544868" Feb 19 23:41:10 crc kubenswrapper[4795]: I0219 23:41:10.512211 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:41:10 crc kubenswrapper[4795]: E0219 23:41:10.513760 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:41:23 crc kubenswrapper[4795]: I0219 23:41:23.516729 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:41:23 crc kubenswrapper[4795]: E0219 23:41:23.517503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:41:36 crc kubenswrapper[4795]: I0219 23:41:36.513418 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:41:36 crc kubenswrapper[4795]: I0219 23:41:36.923329 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf"} Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.607363 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.612286 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.618705 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.704363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.704420 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.704441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806128 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806156 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.841897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.935398 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:51 crc kubenswrapper[4795]: I0219 23:41:51.536453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:41:52 crc kubenswrapper[4795]: I0219 23:41:52.117366 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerID="418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c" exitCode=0 Feb 19 23:41:52 crc kubenswrapper[4795]: I0219 23:41:52.117465 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerDied","Data":"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c"} Feb 19 23:41:52 crc kubenswrapper[4795]: I0219 23:41:52.117548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerStarted","Data":"10e8ad5bc250f45fd68c45c575016b8ca91733714ac148efd5db92b878d83133"} Feb 19 23:41:53 crc kubenswrapper[4795]: I0219 23:41:53.131072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerStarted","Data":"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd"} Feb 19 23:41:54 crc kubenswrapper[4795]: I0219 23:41:54.146931 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerID="4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd" exitCode=0 Feb 19 23:41:54 crc kubenswrapper[4795]: I0219 23:41:54.147161 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" 
event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerDied","Data":"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd"} Feb 19 23:41:55 crc kubenswrapper[4795]: I0219 23:41:55.157029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerStarted","Data":"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11"} Feb 19 23:41:55 crc kubenswrapper[4795]: I0219 23:41:55.181443 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hzj6g" podStartSLOduration=2.7317581459999998 podStartE2EDuration="5.181425216s" podCreationTimestamp="2026-02-19 23:41:50 +0000 UTC" firstStartedPulling="2026-02-19 23:41:52.121456977 +0000 UTC m=+8023.313974841" lastFinishedPulling="2026-02-19 23:41:54.571124057 +0000 UTC m=+8025.763641911" observedRunningTime="2026-02-19 23:41:55.173809675 +0000 UTC m=+8026.366327539" watchObservedRunningTime="2026-02-19 23:41:55.181425216 +0000 UTC m=+8026.373943080" Feb 19 23:42:00 crc kubenswrapper[4795]: I0219 23:42:00.936529 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:00 crc kubenswrapper[4795]: I0219 23:42:00.937434 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:00 crc kubenswrapper[4795]: I0219 23:42:00.991473 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:01 crc kubenswrapper[4795]: I0219 23:42:01.306734 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:01 crc kubenswrapper[4795]: I0219 23:42:01.368268 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.254084 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hzj6g" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="registry-server" containerID="cri-o://ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" gracePeriod=2 Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.777750 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.895450 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") pod \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.895524 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") pod \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.895567 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") pod \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.897020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities" (OuterVolumeSpecName: "utilities") pod "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" (UID: 
"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.904141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9" (OuterVolumeSpecName: "kube-api-access-2mxz9") pod "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" (UID: "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54"). InnerVolumeSpecName "kube-api-access-2mxz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.946477 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" (UID: "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.998268 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") on node \"crc\" DevicePath \"\"" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.998301 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.998310 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.271778 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerID="ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" exitCode=0 Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.271837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerDied","Data":"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11"} Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.271877 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.272133 4795 scope.go:117] "RemoveContainer" containerID="ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.272119 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerDied","Data":"10e8ad5bc250f45fd68c45c575016b8ca91733714ac148efd5db92b878d83133"} Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.295660 4795 scope.go:117] "RemoveContainer" containerID="4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.350507 4795 scope.go:117] "RemoveContainer" containerID="418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.356571 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.369110 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.416096 4795 scope.go:117] "RemoveContainer" 
containerID="ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" Feb 19 23:42:04 crc kubenswrapper[4795]: E0219 23:42:04.417398 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11\": container with ID starting with ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11 not found: ID does not exist" containerID="ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.417610 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11"} err="failed to get container status \"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11\": rpc error: code = NotFound desc = could not find container \"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11\": container with ID starting with ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11 not found: ID does not exist" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.417778 4795 scope.go:117] "RemoveContainer" containerID="4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd" Feb 19 23:42:04 crc kubenswrapper[4795]: E0219 23:42:04.418320 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd\": container with ID starting with 4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd not found: ID does not exist" containerID="4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.418350 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd"} err="failed to get container status \"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd\": rpc error: code = NotFound desc = could not find container \"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd\": container with ID starting with 4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd not found: ID does not exist" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.418372 4795 scope.go:117] "RemoveContainer" containerID="418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c" Feb 19 23:42:04 crc kubenswrapper[4795]: E0219 23:42:04.418754 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c\": container with ID starting with 418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c not found: ID does not exist" containerID="418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.418878 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c"} err="failed to get container status \"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c\": rpc error: code = NotFound desc = could not find container \"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c\": container with ID starting with 418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c not found: ID does not exist" Feb 19 23:42:05 crc kubenswrapper[4795]: I0219 23:42:05.526501 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" path="/var/lib/kubelet/pods/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54/volumes" Feb 19 23:42:28 crc kubenswrapper[4795]: I0219 
23:42:28.462018 4795 scope.go:117] "RemoveContainer" containerID="f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a"
Feb 19 23:42:28 crc kubenswrapper[4795]: I0219 23:42:28.495022 4795 scope.go:117] "RemoveContainer" containerID="bc4e8940e2128aa71b442ef79e89ec98fa8c8ba519f41bdd1d4f14b52325a51b"
Feb 19 23:42:28 crc kubenswrapper[4795]: I0219 23:42:28.519370 4795 scope.go:117] "RemoveContainer" containerID="075ff9636f71ea6b36cce9312e631fc0c64c7440971ccbcdb446ae2d249ac68d"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.713916 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"]
Feb 19 23:43:23 crc kubenswrapper[4795]: E0219 23:43:23.715268 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="extract-utilities"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.715286 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="extract-utilities"
Feb 19 23:43:23 crc kubenswrapper[4795]: E0219 23:43:23.715319 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="extract-content"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.715327 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="extract-content"
Feb 19 23:43:23 crc kubenswrapper[4795]: E0219 23:43:23.715360 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="registry-server"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.715371 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="registry-server"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.715627 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="registry-server"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.717884 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.735794 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"]
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.864358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.864523 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.864563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.967235 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.967367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.967404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.967918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.968258 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.993201 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:24 crc kubenswrapper[4795]: I0219 23:43:24.048553 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:24 crc kubenswrapper[4795]: I0219 23:43:24.571202 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"]
Feb 19 23:43:24 crc kubenswrapper[4795]: I0219 23:43:24.603690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerStarted","Data":"debfe03b9665db2cc109523ff1423a76bfce7cdda7d589e75926cc61bee4c9ef"}
Feb 19 23:43:25 crc kubenswrapper[4795]: I0219 23:43:25.614074 4795 generic.go:334] "Generic (PLEG): container finished" podID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerID="21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1" exitCode=0
Feb 19 23:43:25 crc kubenswrapper[4795]: I0219 23:43:25.614126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerDied","Data":"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1"}
Feb 19 23:43:26 crc kubenswrapper[4795]: I0219 23:43:26.625635 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerStarted","Data":"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574"}
Feb 19 23:43:27 crc kubenswrapper[4795]: I0219 23:43:27.639895 4795 generic.go:334] "Generic (PLEG): container finished" podID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerID="eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574" exitCode=0
Feb 19 23:43:27 crc kubenswrapper[4795]: I0219 23:43:27.640057 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerDied","Data":"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574"}
Feb 19 23:43:28 crc kubenswrapper[4795]: I0219 23:43:28.651115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerStarted","Data":"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a"}
Feb 19 23:43:28 crc kubenswrapper[4795]: I0219 23:43:28.673456 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7c7cc" podStartSLOduration=3.205609124 podStartE2EDuration="5.673433796s" podCreationTimestamp="2026-02-19 23:43:23 +0000 UTC" firstStartedPulling="2026-02-19 23:43:25.616611616 +0000 UTC m=+8116.809129480" lastFinishedPulling="2026-02-19 23:43:28.084436288 +0000 UTC m=+8119.276954152" observedRunningTime="2026-02-19 23:43:28.665675001 +0000 UTC m=+8119.858192875" watchObservedRunningTime="2026-02-19 23:43:28.673433796 +0000 UTC m=+8119.865951660"
Feb 19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.049730 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.050482 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.092815 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.777508 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.828177 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"]
Feb 19 23:43:36 crc kubenswrapper[4795]: I0219 23:43:36.719295 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7c7cc" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="registry-server" containerID="cri-o://f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a" gracePeriod=2
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.249660 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.355017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") pod \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") "
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.355106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") pod \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") "
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.355393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") pod \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") "
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.355963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities" (OuterVolumeSpecName: "utilities") pod "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" (UID: "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.360771 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l" (OuterVolumeSpecName: "kube-api-access-2x46l") pod "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" (UID: "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb"). InnerVolumeSpecName "kube-api-access-2x46l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.457833 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") on node \"crc\" DevicePath \"\""
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.457884 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.543274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" (UID: "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.559488 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.730784 4795 generic.go:334] "Generic (PLEG): container finished" podID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerID="f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a" exitCode=0
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.730831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerDied","Data":"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a"}
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.731873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerDied","Data":"debfe03b9665db2cc109523ff1423a76bfce7cdda7d589e75926cc61bee4c9ef"}
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.730909 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7c7cc"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.731902 4795 scope.go:117] "RemoveContainer" containerID="f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.769595 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"]
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.770835 4795 scope.go:117] "RemoveContainer" containerID="eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.779402 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"]
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.794500 4795 scope.go:117] "RemoveContainer" containerID="21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.832662 4795 scope.go:117] "RemoveContainer" containerID="f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a"
Feb 19 23:43:37 crc kubenswrapper[4795]: E0219 23:43:37.833319 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a\": container with ID starting with f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a not found: ID does not exist" containerID="f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.833380 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a"} err="failed to get container status \"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a\": rpc error: code = NotFound desc = could not find container \"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a\": container with ID starting with f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a not found: ID does not exist"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.833419 4795 scope.go:117] "RemoveContainer" containerID="eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574"
Feb 19 23:43:37 crc kubenswrapper[4795]: E0219 23:43:37.833860 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574\": container with ID starting with eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574 not found: ID does not exist" containerID="eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.833897 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574"} err="failed to get container status \"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574\": rpc error: code = NotFound desc = could not find container \"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574\": container with ID starting with eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574 not found: ID does not exist"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.833922 4795 scope.go:117] "RemoveContainer" containerID="21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1"
Feb 19 23:43:37 crc kubenswrapper[4795]: E0219 23:43:37.834220 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1\": container with ID starting with 21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1 not found: ID does not exist" containerID="21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1"
Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.834268 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1"} err="failed to get container status \"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1\": rpc error: code = NotFound desc = could not find container \"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1\": container with ID starting with 21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1 not found: ID does not exist"
Feb 19 23:43:39 crc kubenswrapper[4795]: I0219 23:43:39.529001 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" path="/var/lib/kubelet/pods/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb/volumes"
Feb 19 23:43:58 crc kubenswrapper[4795]: I0219 23:43:58.427485 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:43:58 crc kubenswrapper[4795]: I0219 23:43:58.428224 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.143122 4795 generic.go:334] "Generic (PLEG): container finished" podID="8272a408-0416-4077-9e85-b2962992b3f4" containerID="0b42e21c1af1d461acf5578be3104a023047049f42895449a7c537e578dfd456" exitCode=0
Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.143275 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" event={"ID":"8272a408-0416-4077-9e85-b2962992b3f4","Type":"ContainerDied","Data":"0b42e21c1af1d461acf5578be3104a023047049f42895449a7c537e578dfd456"}
Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.828966 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"]
Feb 19 23:44:18 crc kubenswrapper[4795]: E0219 23:44:18.829654 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="registry-server"
Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.829676 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="registry-server"
Feb 19 23:44:18 crc kubenswrapper[4795]: E0219 23:44:18.829700 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="extract-content"
Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.829712 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="extract-content"
Feb 19 23:44:18 crc kubenswrapper[4795]: E0219 23:44:18.829737 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="extract-utilities"
Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.829748 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="extract-utilities"
Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.830127 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="registry-server"
Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.832300 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.850517 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"]
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.009164 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.009481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.009624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.110887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.110983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.111005 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.111652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.111890 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.132508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.162436 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.701480 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.833992 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") "
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834075 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") "
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") "
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") "
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834293 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") "
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") "
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834402 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") "
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834465 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") "
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.860429 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf" (OuterVolumeSpecName: "kube-api-access-d52zf") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "kube-api-access-d52zf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.860902 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.862399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph" (OuterVolumeSpecName: "ceph") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.870033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"]
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.895399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.896807 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory" (OuterVolumeSpecName: "inventory") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.898732 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.926302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.926319 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937857 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937902 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937913 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937926 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937938 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937950 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937963 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937974 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.161801 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" event={"ID":"8272a408-0416-4077-9e85-b2962992b3f4","Type":"ContainerDied","Data":"4bd117515658cc36bb2328744f0a39f6b5d75a26befda093e2b0ac74514155d0"}
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.161846 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bd117515658cc36bb2328744f0a39f6b5d75a26befda093e2b0ac74514155d0"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.161862 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-45fwk"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.163420 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6a307d7-05c1-401d-af88-11c5da428876" containerID="99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf" exitCode=0
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.163470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerDied","Data":"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf"}
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.163503 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerStarted","Data":"2ed696cd125461c0e2485dab7fecdb64757ed0c48c355f72d39b69f2db14c4c7"}
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.276245 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-56tm4"]
Feb 19 23:44:20 crc kubenswrapper[4795]: E0219 23:44:20.276710 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8272a408-0416-4077-9e85-b2962992b3f4" containerName="telemetry-openstack-openstack-cell1"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.276729 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8272a408-0416-4077-9e85-b2962992b3f4" containerName="telemetry-openstack-openstack-cell1"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.276913 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8272a408-0416-4077-9e85-b2962992b3f4" containerName="telemetry-openstack-openstack-cell1"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.277690 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.281411 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.281613 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.281659 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.284607 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.284834 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.285415 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-56tm4"]
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.446745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447587 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.549997 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: 
\"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550255 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.556418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.556430 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.556717 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.557291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.558191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.566905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.614428 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:21 crc kubenswrapper[4795]: I0219 23:44:21.176612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerStarted","Data":"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247"} Feb 19 23:44:21 crc kubenswrapper[4795]: I0219 23:44:21.192029 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-56tm4"] Feb 19 23:44:21 crc kubenswrapper[4795]: W0219 23:44:21.198064 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda29cf217_b932_4515_a8e6_4bb762611d24.slice/crio-5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883 WatchSource:0}: Error finding container 5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883: Status 404 returned error can't find the container with id 5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883 Feb 19 23:44:22 crc kubenswrapper[4795]: I0219 23:44:22.189528 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" event={"ID":"a29cf217-b932-4515-a8e6-4bb762611d24","Type":"ContainerStarted","Data":"5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883"} Feb 19 23:44:23 crc kubenswrapper[4795]: I0219 23:44:23.205517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" event={"ID":"a29cf217-b932-4515-a8e6-4bb762611d24","Type":"ContainerStarted","Data":"1a18107b1e847d549a8cb6704fc642c32d982fc02150fa86a18638e1eb9c0ebc"} Feb 19 23:44:23 crc 
kubenswrapper[4795]: I0219 23:44:23.236039 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" podStartSLOduration=2.570472733 podStartE2EDuration="3.23601117s" podCreationTimestamp="2026-02-19 23:44:20 +0000 UTC" firstStartedPulling="2026-02-19 23:44:21.203753503 +0000 UTC m=+8172.396271367" lastFinishedPulling="2026-02-19 23:44:21.86929193 +0000 UTC m=+8173.061809804" observedRunningTime="2026-02-19 23:44:23.222680152 +0000 UTC m=+8174.415198036" watchObservedRunningTime="2026-02-19 23:44:23.23601117 +0000 UTC m=+8174.428529044" Feb 19 23:44:24 crc kubenswrapper[4795]: I0219 23:44:24.216873 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6a307d7-05c1-401d-af88-11c5da428876" containerID="fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247" exitCode=0 Feb 19 23:44:24 crc kubenswrapper[4795]: I0219 23:44:24.216967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerDied","Data":"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247"} Feb 19 23:44:25 crc kubenswrapper[4795]: I0219 23:44:25.234227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerStarted","Data":"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925"} Feb 19 23:44:25 crc kubenswrapper[4795]: I0219 23:44:25.265932 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lc9vb" podStartSLOduration=2.847881875 podStartE2EDuration="7.265913904s" podCreationTimestamp="2026-02-19 23:44:18 +0000 UTC" firstStartedPulling="2026-02-19 23:44:20.165963184 +0000 UTC m=+8171.358481048" lastFinishedPulling="2026-02-19 23:44:24.583995203 +0000 UTC m=+8175.776513077" 
observedRunningTime="2026-02-19 23:44:25.258366865 +0000 UTC m=+8176.450884749" watchObservedRunningTime="2026-02-19 23:44:25.265913904 +0000 UTC m=+8176.458431768" Feb 19 23:44:28 crc kubenswrapper[4795]: I0219 23:44:28.427961 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:44:28 crc kubenswrapper[4795]: I0219 23:44:28.428618 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:44:29 crc kubenswrapper[4795]: I0219 23:44:29.163996 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:29 crc kubenswrapper[4795]: I0219 23:44:29.164065 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:30 crc kubenswrapper[4795]: I0219 23:44:30.216766 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lc9vb" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="registry-server" probeResult="failure" output=< Feb 19 23:44:30 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:44:30 crc kubenswrapper[4795]: > Feb 19 23:44:39 crc kubenswrapper[4795]: I0219 23:44:39.218129 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:39 crc kubenswrapper[4795]: I0219 23:44:39.269140 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:39 crc kubenswrapper[4795]: I0219 23:44:39.456114 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"] Feb 19 23:44:40 crc kubenswrapper[4795]: I0219 23:44:40.383368 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lc9vb" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="registry-server" containerID="cri-o://d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925" gracePeriod=2 Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.032916 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.201153 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") pod \"e6a307d7-05c1-401d-af88-11c5da428876\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.201270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") pod \"e6a307d7-05c1-401d-af88-11c5da428876\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.201487 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") pod \"e6a307d7-05c1-401d-af88-11c5da428876\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.202261 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities" (OuterVolumeSpecName: "utilities") pod "e6a307d7-05c1-401d-af88-11c5da428876" (UID: "e6a307d7-05c1-401d-af88-11c5da428876"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.209421 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx" (OuterVolumeSpecName: "kube-api-access-cb7kx") pod "e6a307d7-05c1-401d-af88-11c5da428876" (UID: "e6a307d7-05c1-401d-af88-11c5da428876"). InnerVolumeSpecName "kube-api-access-cb7kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.303885 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.303920 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.322781 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6a307d7-05c1-401d-af88-11c5da428876" (UID: "e6a307d7-05c1-401d-af88-11c5da428876"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.395796 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6a307d7-05c1-401d-af88-11c5da428876" containerID="d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925" exitCode=0 Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.395897 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.396683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerDied","Data":"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925"} Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.396806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerDied","Data":"2ed696cd125461c0e2485dab7fecdb64757ed0c48c355f72d39b69f2db14c4c7"} Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.396894 4795 scope.go:117] "RemoveContainer" containerID="d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.406216 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.435840 4795 scope.go:117] "RemoveContainer" containerID="fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.441904 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"] Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 
23:44:41.452406 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"] Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.464551 4795 scope.go:117] "RemoveContainer" containerID="99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.509868 4795 scope.go:117] "RemoveContainer" containerID="d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925" Feb 19 23:44:41 crc kubenswrapper[4795]: E0219 23:44:41.510362 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925\": container with ID starting with d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925 not found: ID does not exist" containerID="d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.510391 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925"} err="failed to get container status \"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925\": rpc error: code = NotFound desc = could not find container \"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925\": container with ID starting with d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925 not found: ID does not exist" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.510410 4795 scope.go:117] "RemoveContainer" containerID="fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247" Feb 19 23:44:41 crc kubenswrapper[4795]: E0219 23:44:41.510848 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247\": container with ID 
starting with fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247 not found: ID does not exist" containerID="fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.510894 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247"} err="failed to get container status \"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247\": rpc error: code = NotFound desc = could not find container \"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247\": container with ID starting with fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247 not found: ID does not exist" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.510921 4795 scope.go:117] "RemoveContainer" containerID="99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf" Feb 19 23:44:41 crc kubenswrapper[4795]: E0219 23:44:41.511223 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf\": container with ID starting with 99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf not found: ID does not exist" containerID="99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.511259 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf"} err="failed to get container status \"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf\": rpc error: code = NotFound desc = could not find container \"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf\": container with ID starting with 99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf not found: 
ID does not exist" Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.524691 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a307d7-05c1-401d-af88-11c5da428876" path="/var/lib/kubelet/pods/e6a307d7-05c1-401d-af88-11c5da428876/volumes" Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.427064 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.427611 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.427652 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.428441 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.428485 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
containerID="cri-o://9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf" gracePeriod=600 Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.589508 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf" exitCode=0 Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.589548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf"} Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.589585 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:44:59 crc kubenswrapper[4795]: I0219 23:44:59.599828 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"} Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.163805 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"] Feb 19 23:45:00 crc kubenswrapper[4795]: E0219 23:45:00.164745 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="extract-utilities" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.164770 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="extract-utilities" Feb 19 23:45:00 crc kubenswrapper[4795]: E0219 23:45:00.164796 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a307d7-05c1-401d-af88-11c5da428876" 
containerName="registry-server" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.164805 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="registry-server" Feb 19 23:45:00 crc kubenswrapper[4795]: E0219 23:45:00.164844 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="extract-content" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.164855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="extract-content" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.165302 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="registry-server" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.166477 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.168785 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.168963 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.178426 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"] Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.342434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") pod \"collect-profiles-29525745-x445j\" (UID: 
\"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.342551 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.342622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.444273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.444405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.444513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.446376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.450866 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.460508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.504622 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:01 crc kubenswrapper[4795]: I0219 23:45:01.001490 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"] Feb 19 23:45:01 crc kubenswrapper[4795]: I0219 23:45:01.625026 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" containerID="0c1d03404adfbd4c18c1ebcc173828750273a5e9125c4f9041a81557f10583df" exitCode=0 Feb 19 23:45:01 crc kubenswrapper[4795]: I0219 23:45:01.625105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" event={"ID":"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d","Type":"ContainerDied","Data":"0c1d03404adfbd4c18c1ebcc173828750273a5e9125c4f9041a81557f10583df"} Feb 19 23:45:01 crc kubenswrapper[4795]: I0219 23:45:01.625330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" event={"ID":"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d","Type":"ContainerStarted","Data":"723918bc990f581f7fd525906c5a8a3950e23b609e1c7a7f8e318a49812fd600"} Feb 19 23:45:02 crc kubenswrapper[4795]: I0219 23:45:02.997834 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.010351 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") pod \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.010397 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") pod \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.010728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") pod \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.011228 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume" (OuterVolumeSpecName: "config-volume") pod "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" (UID: "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.016349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" (UID: "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.017505 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c" (OuterVolumeSpecName: "kube-api-access-khs4c") pod "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" (UID: "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d"). InnerVolumeSpecName "kube-api-access-khs4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.112228 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.112264 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.112275 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.645978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" event={"ID":"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d","Type":"ContainerDied","Data":"723918bc990f581f7fd525906c5a8a3950e23b609e1c7a7f8e318a49812fd600"} Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.646492 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723918bc990f581f7fd525906c5a8a3950e23b609e1c7a7f8e318a49812fd600" Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.646063 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" Feb 19 23:45:04 crc kubenswrapper[4795]: I0219 23:45:04.082282 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"] Feb 19 23:45:04 crc kubenswrapper[4795]: I0219 23:45:04.091213 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"] Feb 19 23:45:05 crc kubenswrapper[4795]: I0219 23:45:05.527144 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40381732-f007-4395-b8d1-02b3fc37b091" path="/var/lib/kubelet/pods/40381732-f007-4395-b8d1-02b3fc37b091/volumes" Feb 19 23:45:26 crc kubenswrapper[4795]: I0219 23:45:26.883129 4795 generic.go:334] "Generic (PLEG): container finished" podID="a29cf217-b932-4515-a8e6-4bb762611d24" containerID="1a18107b1e847d549a8cb6704fc642c32d982fc02150fa86a18638e1eb9c0ebc" exitCode=0 Feb 19 23:45:26 crc kubenswrapper[4795]: I0219 23:45:26.884586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" event={"ID":"a29cf217-b932-4515-a8e6-4bb762611d24","Type":"ContainerDied","Data":"1a18107b1e847d549a8cb6704fc642c32d982fc02150fa86a18638e1eb9c0ebc"} Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.429977 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515626 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515721 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515806 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.533496 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph" (OuterVolumeSpecName: "ceph") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.533534 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.533779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w" (OuterVolumeSpecName: "kube-api-access-np22w") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "kube-api-access-np22w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.557462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.567362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory" (OuterVolumeSpecName: "inventory") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.569211 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618077 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618128 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618144 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618161 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618198 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618216 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.672933 4795 scope.go:117] "RemoveContainer" containerID="f0a7532029d7c277b9d534f1fee42695eb4906e52f264e18f9db0ba49bbbdeb7" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.927914 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" event={"ID":"a29cf217-b932-4515-a8e6-4bb762611d24","Type":"ContainerDied","Data":"5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883"} Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.927954 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.927968 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043255 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-khvh4"] Feb 19 23:45:29 crc kubenswrapper[4795]: E0219 23:45:29.043675 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" containerName="collect-profiles" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043691 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" containerName="collect-profiles" Feb 19 23:45:29 crc kubenswrapper[4795]: E0219 23:45:29.043734 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29cf217-b932-4515-a8e6-4bb762611d24" containerName="neutron-sriov-openstack-openstack-cell1" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043743 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29cf217-b932-4515-a8e6-4bb762611d24" containerName="neutron-sriov-openstack-openstack-cell1" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29cf217-b932-4515-a8e6-4bb762611d24" containerName="neutron-sriov-openstack-openstack-cell1" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043952 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" containerName="collect-profiles" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.044644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.066838 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.067121 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.067259 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.067367 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.067473 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.074080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-khvh4"] Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.130538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131008 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131380 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") pod 
\"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232928 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232999 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.233068 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.237384 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.237419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.237519 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.237826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.238253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.249624 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.395768 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.947561 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-khvh4"] Feb 19 23:45:30 crc kubenswrapper[4795]: I0219 23:45:30.957387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" event={"ID":"ff3df901-a0ae-456e-8103-60aaa6439785","Type":"ContainerStarted","Data":"c2f202e4a47f5834d5d4ba7b5ccaec0183d8ee7a0ac71915df5f8ba0ec70bb7a"} Feb 19 23:45:30 crc kubenswrapper[4795]: I0219 23:45:30.958573 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" event={"ID":"ff3df901-a0ae-456e-8103-60aaa6439785","Type":"ContainerStarted","Data":"0676394dc314b80aee4627ac3a27539eb9572cc4cc7886c949a8f92d0416d8a3"} Feb 19 23:45:30 crc kubenswrapper[4795]: I0219 23:45:30.979260 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" podStartSLOduration=1.5180314940000001 podStartE2EDuration="1.979241577s" podCreationTimestamp="2026-02-19 23:45:29 +0000 UTC" firstStartedPulling="2026-02-19 23:45:29.961617696 +0000 UTC m=+8241.154135590" lastFinishedPulling="2026-02-19 23:45:30.422827809 +0000 UTC m=+8241.615345673" observedRunningTime="2026-02-19 23:45:30.974044943 +0000 UTC m=+8242.166562817" watchObservedRunningTime="2026-02-19 23:45:30.979241577 +0000 UTC m=+8242.171759451" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.482758 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.486569 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.500835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.556893 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.557034 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.557080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.659088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.659302 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.659377 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.660898 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.661157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.702199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.823436 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.288892 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.606107 4795 generic.go:334] "Generic (PLEG): container finished" podID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerID="15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc" exitCode=0 Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.606335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerDied","Data":"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc"} Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.606453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerStarted","Data":"82b949cb4644af7dac55f5e52cb7a9cd5efb4d958c1a6c32fcc97200612266db"} Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.609359 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:46:27 crc kubenswrapper[4795]: I0219 23:46:27.617893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerStarted","Data":"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1"} Feb 19 23:46:28 crc kubenswrapper[4795]: I0219 23:46:28.629570 4795 generic.go:334] "Generic (PLEG): container finished" podID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerID="9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1" exitCode=0 Feb 19 23:46:28 crc kubenswrapper[4795]: I0219 23:46:28.629648 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerDied","Data":"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1"} Feb 19 23:46:29 crc kubenswrapper[4795]: I0219 23:46:29.641926 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerStarted","Data":"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4"} Feb 19 23:46:29 crc kubenswrapper[4795]: I0219 23:46:29.666628 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbmn2" podStartSLOduration=2.230168811 podStartE2EDuration="4.666611764s" podCreationTimestamp="2026-02-19 23:46:25 +0000 UTC" firstStartedPulling="2026-02-19 23:46:26.609067853 +0000 UTC m=+8297.801585727" lastFinishedPulling="2026-02-19 23:46:29.045510816 +0000 UTC m=+8300.238028680" observedRunningTime="2026-02-19 23:46:29.666385928 +0000 UTC m=+8300.858903792" watchObservedRunningTime="2026-02-19 23:46:29.666611764 +0000 UTC m=+8300.859129628" Feb 19 23:46:35 crc kubenswrapper[4795]: I0219 23:46:35.824205 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:35 crc kubenswrapper[4795]: I0219 23:46:35.824806 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:35 crc kubenswrapper[4795]: I0219 23:46:35.877849 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:36 crc kubenswrapper[4795]: I0219 23:46:36.765808 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:36 crc kubenswrapper[4795]: I0219 23:46:36.813091 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:38 crc kubenswrapper[4795]: I0219 23:46:38.733710 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbmn2" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="registry-server" containerID="cri-o://73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" gracePeriod=2 Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.199088 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.294598 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") pod \"b84443bf-9eee-4582-975b-6eb1a02b856b\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.294923 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") pod \"b84443bf-9eee-4582-975b-6eb1a02b856b\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.295110 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") pod \"b84443bf-9eee-4582-975b-6eb1a02b856b\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.295791 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities" (OuterVolumeSpecName: "utilities") pod 
"b84443bf-9eee-4582-975b-6eb1a02b856b" (UID: "b84443bf-9eee-4582-975b-6eb1a02b856b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.306858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb" (OuterVolumeSpecName: "kube-api-access-dmrfb") pod "b84443bf-9eee-4582-975b-6eb1a02b856b" (UID: "b84443bf-9eee-4582-975b-6eb1a02b856b"). InnerVolumeSpecName "kube-api-access-dmrfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.316355 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b84443bf-9eee-4582-975b-6eb1a02b856b" (UID: "b84443bf-9eee-4582-975b-6eb1a02b856b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.397644 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.397686 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.397696 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.743330 4795 generic.go:334] "Generic (PLEG): container finished" podID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerID="73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" exitCode=0 Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.743379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerDied","Data":"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4"} Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.743411 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerDied","Data":"82b949cb4644af7dac55f5e52cb7a9cd5efb4d958c1a6c32fcc97200612266db"} Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.743433 4795 scope.go:117] "RemoveContainer" containerID="73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 
23:46:39.743468 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.770047 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.770158 4795 scope.go:117] "RemoveContainer" containerID="9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.784676 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.789514 4795 scope.go:117] "RemoveContainer" containerID="15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.837813 4795 scope.go:117] "RemoveContainer" containerID="73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" Feb 19 23:46:39 crc kubenswrapper[4795]: E0219 23:46:39.843800 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4\": container with ID starting with 73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4 not found: ID does not exist" containerID="73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.843838 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4"} err="failed to get container status \"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4\": rpc error: code = NotFound desc = could not find container \"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4\": container with ID starting with 
73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4 not found: ID does not exist" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.843862 4795 scope.go:117] "RemoveContainer" containerID="9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1" Feb 19 23:46:39 crc kubenswrapper[4795]: E0219 23:46:39.844259 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1\": container with ID starting with 9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1 not found: ID does not exist" containerID="9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.844285 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1"} err="failed to get container status \"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1\": rpc error: code = NotFound desc = could not find container \"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1\": container with ID starting with 9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1 not found: ID does not exist" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.844301 4795 scope.go:117] "RemoveContainer" containerID="15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc" Feb 19 23:46:39 crc kubenswrapper[4795]: E0219 23:46:39.844698 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc\": container with ID starting with 15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc not found: ID does not exist" containerID="15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc" Feb 19 23:46:39 crc 
kubenswrapper[4795]: I0219 23:46:39.844720 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc"} err="failed to get container status \"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc\": rpc error: code = NotFound desc = could not find container \"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc\": container with ID starting with 15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc not found: ID does not exist" Feb 19 23:46:41 crc kubenswrapper[4795]: I0219 23:46:41.542979 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" path="/var/lib/kubelet/pods/b84443bf-9eee-4582-975b-6eb1a02b856b/volumes" Feb 19 23:46:46 crc kubenswrapper[4795]: I0219 23:46:46.824959 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff3df901-a0ae-456e-8103-60aaa6439785" containerID="c2f202e4a47f5834d5d4ba7b5ccaec0183d8ee7a0ac71915df5f8ba0ec70bb7a" exitCode=0 Feb 19 23:46:46 crc kubenswrapper[4795]: I0219 23:46:46.825099 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" event={"ID":"ff3df901-a0ae-456e-8103-60aaa6439785","Type":"ContainerDied","Data":"c2f202e4a47f5834d5d4ba7b5ccaec0183d8ee7a0ac71915df5f8ba0ec70bb7a"} Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.248279 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301192 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301355 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301563 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301821 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.309379 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph" (OuterVolumeSpecName: "ceph") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.320477 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.334080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6" (OuterVolumeSpecName: "kube-api-access-spsv6") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "kube-api-access-spsv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.339322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.339423 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.355242 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory" (OuterVolumeSpecName: "inventory") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405348 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405386 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405404 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405420 4795 reconciler_common.go:293] "Volume detached for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405437 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405449 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.843132 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.843018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" event={"ID":"ff3df901-a0ae-456e-8103-60aaa6439785","Type":"ContainerDied","Data":"0676394dc314b80aee4627ac3a27539eb9572cc4cc7886c949a8f92d0416d8a3"} Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.843569 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0676394dc314b80aee4627ac3a27539eb9572cc4cc7886c949a8f92d0416d8a3" Feb 19 23:46:58 crc kubenswrapper[4795]: I0219 23:46:58.427316 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:46:58 crc kubenswrapper[4795]: I0219 23:46:58.427810 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.247765 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.248324 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.717747 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.718243 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" containerID="cri-o://47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.878768 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.879260 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" containerID="cri-o://ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.879330 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" 
containerID="cri-o://d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.896049 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.896469 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" containerID="cri-o://1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.933670 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.934206 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" containerID="cri-o://aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.934625 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" containerID="cri-o://fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" gracePeriod=30 Feb 19 23:47:01 crc kubenswrapper[4795]: E0219 23:47:01.322124 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 23:47:01 crc kubenswrapper[4795]: E0219 23:47:01.324220 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 23:47:01 crc kubenswrapper[4795]: E0219 23:47:01.325705 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 23:47:01 crc kubenswrapper[4795]: E0219 23:47:01.325746 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.837913 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.937719 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") pod \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.937833 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnl2g\" (UniqueName: \"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") pod \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.937893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") pod \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.943601 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g" (OuterVolumeSpecName: "kube-api-access-qnl2g") pod "f1a9135a-42ef-42ca-880a-f4f5ffd78a13" (UID: "f1a9135a-42ef-42ca-880a-f4f5ffd78a13"). InnerVolumeSpecName "kube-api-access-qnl2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.967828 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1a9135a-42ef-42ca-880a-f4f5ffd78a13" (UID: "f1a9135a-42ef-42ca-880a-f4f5ffd78a13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.970539 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data" (OuterVolumeSpecName: "config-data") pod "f1a9135a-42ef-42ca-880a-f4f5ffd78a13" (UID: "f1a9135a-42ef-42ca-880a-f4f5ffd78a13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006732 4795 generic.go:334] "Generic (PLEG): container finished" podID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" exitCode=0 Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006791 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f1a9135a-42ef-42ca-880a-f4f5ffd78a13","Type":"ContainerDied","Data":"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1"} Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006817 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f1a9135a-42ef-42ca-880a-f4f5ffd78a13","Type":"ContainerDied","Data":"2adc08c6dd489704d7eddd8052ac3149a11c14f7992162c2dccd22cfce6e5fe5"} Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006835 4795 scope.go:117] "RemoveContainer" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006979 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.014131 4795 generic.go:334] "Generic (PLEG): container finished" podID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerID="aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" exitCode=143 Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.014202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerDied","Data":"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b"} Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.016588 4795 generic.go:334] "Generic (PLEG): container finished" podID="8af13c78-4805-4828-980c-45e1defd94c3" containerID="ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" exitCode=143 Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.016621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerDied","Data":"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21"} Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.040692 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.040727 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnl2g\" (UniqueName: \"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.040766 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 
23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.061563 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.063238 4795 scope.go:117] "RemoveContainer" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.063604 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1\": container with ID starting with 47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1 not found: ID does not exist" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.063636 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1"} err="failed to get container status \"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1\": rpc error: code = NotFound desc = could not find container \"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1\": container with ID starting with 47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1 not found: ID does not exist" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.078971 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094380 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094874 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="extract-utilities" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094892 4795 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="extract-utilities" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094909 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3df901-a0ae-456e-8103-60aaa6439785" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094916 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3df901-a0ae-456e-8103-60aaa6439785" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094929 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094936 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094963 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="registry-server" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094969 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="registry-server" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094983 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="extract-content" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094989 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="extract-content" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.095212 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" Feb 19 
23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.095225 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="registry-server" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.095234 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3df901-a0ae-456e-8103-60aaa6439785" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.095957 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.100850 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.105745 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.142601 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.142685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drtxw\" (UniqueName: \"kubernetes.io/projected/0fa49294-8a0c-4d98-a388-067bdce0ac1b-kube-api-access-drtxw\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.142747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.244355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.244736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drtxw\" (UniqueName: \"kubernetes.io/projected/0fa49294-8a0c-4d98-a388-067bdce0ac1b-kube-api-access-drtxw\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.244783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.248495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.256288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.260902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drtxw\" (UniqueName: \"kubernetes.io/projected/0fa49294-8a0c-4d98-a388-067bdce0ac1b-kube-api-access-drtxw\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.468398 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.472986 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.550270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") pod \"761a7217-33fa-4d78-8a05-492cbb33f48d\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.550351 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") pod \"761a7217-33fa-4d78-8a05-492cbb33f48d\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.550673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mzqk\" (UniqueName: \"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") pod \"761a7217-33fa-4d78-8a05-492cbb33f48d\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.586141 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk" (OuterVolumeSpecName: "kube-api-access-7mzqk") pod "761a7217-33fa-4d78-8a05-492cbb33f48d" (UID: "761a7217-33fa-4d78-8a05-492cbb33f48d"). InnerVolumeSpecName "kube-api-access-7mzqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.623298 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data" (OuterVolumeSpecName: "config-data") pod "761a7217-33fa-4d78-8a05-492cbb33f48d" (UID: "761a7217-33fa-4d78-8a05-492cbb33f48d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.638759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "761a7217-33fa-4d78-8a05-492cbb33f48d" (UID: "761a7217-33fa-4d78-8a05-492cbb33f48d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.656634 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mzqk\" (UniqueName: \"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.656674 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.656683 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026769 4795 generic.go:334] "Generic (PLEG): container finished" podID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerID="2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" exitCode=0 Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026816 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"761a7217-33fa-4d78-8a05-492cbb33f48d","Type":"ContainerDied","Data":"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5"} Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026862 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"761a7217-33fa-4d78-8a05-492cbb33f48d","Type":"ContainerDied","Data":"a920c7c53728d52d3ab518fdecf0b6800cb795ab36fabc65145920815940fa68"} Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026880 4795 scope.go:117] "RemoveContainer" containerID="2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026825 4795 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.058839 4795 scope.go:117] "RemoveContainer" containerID="2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" Feb 19 23:47:03 crc kubenswrapper[4795]: E0219 23:47:03.059200 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5\": container with ID starting with 2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5 not found: ID does not exist" containerID="2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.059243 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5"} err="failed to get container status \"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5\": rpc error: code = NotFound desc = could not find container \"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5\": container with ID starting with 2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5 not found: ID does not exist" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.063477 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.078716 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.094572 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.105270 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:03 crc 
kubenswrapper[4795]: E0219 23:47:03.105752 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerName="nova-cell0-conductor-conductor" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.105777 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerName="nova-cell0-conductor-conductor" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.106036 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerName="nova-cell0-conductor-conductor" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.106905 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.109002 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.118402 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.167009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.167693 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.167910 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xpls\" (UniqueName: \"kubernetes.io/projected/d27d8041-4940-4cd2-bf9e-02b7aa924067-kube-api-access-6xpls\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.269200 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.269374 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.269430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xpls\" (UniqueName: \"kubernetes.io/projected/d27d8041-4940-4cd2-bf9e-02b7aa924067-kube-api-access-6xpls\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.274630 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.276155 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.286574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xpls\" (UniqueName: \"kubernetes.io/projected/d27d8041-4940-4cd2-bf9e-02b7aa924067-kube-api-access-6xpls\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.536046 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" path="/var/lib/kubelet/pods/761a7217-33fa-4d78-8a05-492cbb33f48d/volumes" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.536991 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" path="/var/lib/kubelet/pods/f1a9135a-42ef-42ca-880a-f4f5ffd78a13/volumes" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.548029 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.021523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:04 crc kubenswrapper[4795]: W0219 23:47:04.030804 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd27d8041_4940_4cd2_bf9e_02b7aa924067.slice/crio-01cdf5eb54fb6ab079bfe1c87f56fed5c2e6903145b84ea12364255cd0de0dd9 WatchSource:0}: Error finding container 01cdf5eb54fb6ab079bfe1c87f56fed5c2e6903145b84ea12364255cd0de0dd9: Status 404 returned error can't find the container with id 01cdf5eb54fb6ab079bfe1c87f56fed5c2e6903145b84ea12364255cd0de0dd9 Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.040400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0fa49294-8a0c-4d98-a388-067bdce0ac1b","Type":"ContainerStarted","Data":"5f5d0f57bbef303932012df35b0d662bd9583e35a0c826f07fa25fce61b48e82"} Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.040633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0fa49294-8a0c-4d98-a388-067bdce0ac1b","Type":"ContainerStarted","Data":"a3b717517b1ba0d636e16f433517f2ea2922b0dd075ee9edc9b8d78b4178b2c6"} Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.041872 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.061009 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.060989618 podStartE2EDuration="2.060989618s" podCreationTimestamp="2026-02-19 23:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
23:47:04.059820496 +0000 UTC m=+8335.252338360" watchObservedRunningTime="2026-02-19 23:47:04.060989618 +0000 UTC m=+8335.253507482" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.089764 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:43098->10.217.1.83:8775: read: connection reset by peer" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.090150 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:43096->10.217.1.83:8775: read: connection reset by peer" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.493850 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.596048 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") pod \"8af13c78-4805-4828-980c-45e1defd94c3\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.596188 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") pod \"8af13c78-4805-4828-980c-45e1defd94c3\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.596245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2xch\" (UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") pod \"8af13c78-4805-4828-980c-45e1defd94c3\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.596279 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") pod \"8af13c78-4805-4828-980c-45e1defd94c3\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.598680 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs" (OuterVolumeSpecName: "logs") pod "8af13c78-4805-4828-980c-45e1defd94c3" (UID: "8af13c78-4805-4828-980c-45e1defd94c3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.603670 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch" (OuterVolumeSpecName: "kube-api-access-r2xch") pod "8af13c78-4805-4828-980c-45e1defd94c3" (UID: "8af13c78-4805-4828-980c-45e1defd94c3"). InnerVolumeSpecName "kube-api-access-r2xch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.633826 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data" (OuterVolumeSpecName: "config-data") pod "8af13c78-4805-4828-980c-45e1defd94c3" (UID: "8af13c78-4805-4828-980c-45e1defd94c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.644552 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8af13c78-4805-4828-980c-45e1defd94c3" (UID: "8af13c78-4805-4828-980c-45e1defd94c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.645103 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.701726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") pod \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.701822 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") pod \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.701924 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") pod \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702111 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") pod \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702609 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs" (OuterVolumeSpecName: "logs") pod "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" (UID: "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702641 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702655 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702666 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2xch\" (UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702675 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.712851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w" (OuterVolumeSpecName: "kube-api-access-2dq6w") pod "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" (UID: "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9"). InnerVolumeSpecName "kube-api-access-2dq6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.747397 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" (UID: "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.765520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data" (OuterVolumeSpecName: "config-data") pod "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" (UID: "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.804020 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.804049 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.804058 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.804066 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.054423 4795 generic.go:334] "Generic (PLEG): container finished" podID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerID="fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" exitCode=0 Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.054492 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.054534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerDied","Data":"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.058305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerDied","Data":"df7150bd3c379f19d4f935bb8e348093119703bff34dd1ad6781416721057a60"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.058337 4795 scope.go:117] "RemoveContainer" containerID="fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.060760 4795 generic.go:334] "Generic (PLEG): container finished" podID="8af13c78-4805-4828-980c-45e1defd94c3" containerID="d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" exitCode=0 Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.060808 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.060847 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerDied","Data":"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.060880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerDied","Data":"7ee07f55b63d65c8ea13f8ca8377dd262c2422aff92a6a2abfe47d6fef72c015"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.064255 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d27d8041-4940-4cd2-bf9e-02b7aa924067","Type":"ContainerStarted","Data":"fe60ce54dc10f3166342ae8d794a6371165923563cbaf10fe016273bac89a75e"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.064297 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.064308 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d27d8041-4940-4cd2-bf9e-02b7aa924067","Type":"ContainerStarted","Data":"01cdf5eb54fb6ab079bfe1c87f56fed5c2e6903145b84ea12364255cd0de0dd9"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.086464 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.086446035 podStartE2EDuration="2.086446035s" podCreationTimestamp="2026-02-19 23:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:47:05.077462717 +0000 UTC m=+8336.269980581" watchObservedRunningTime="2026-02-19 23:47:05.086446035 +0000 UTC m=+8336.278963899" Feb 19 
23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.205634 4795 scope.go:117] "RemoveContainer" containerID="aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.236637 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.249802 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266076 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.266587 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266612 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.266632 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266641 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.266661 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266667 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.266678 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" 
containerName="nova-metadata-metadata" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266684 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266911 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266934 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266957 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.269234 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.282059 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.292036 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.315672 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.316125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5s2s\" (UniqueName: \"kubernetes.io/projected/644df2e5-37fd-468b-9e52-316d44e65f69-kube-api-access-q5s2s\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.316191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644df2e5-37fd-468b-9e52-316d44e65f69-logs\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.316231 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-config-data\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.319824 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] 
Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.345537 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 is running failed: container process not found" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.347306 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 is running failed: container process not found" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.360313 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 is running failed: container process not found" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.360388 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.360684 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:05 crc 
kubenswrapper[4795]: I0219 23:47:05.377490 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.385632 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.387371 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.388345 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.401148 4795 scope.go:117] "RemoveContainer" containerID="fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.402284 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f\": container with ID starting with fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f not found: ID does not exist" containerID="fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.402311 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f"} err="failed to get container status \"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f\": rpc error: code = NotFound desc = could not find container \"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f\": container with ID starting with fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f not found: ID does not exist" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.402332 4795 scope.go:117] "RemoveContainer" 
containerID="aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.405791 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b\": container with ID starting with aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b not found: ID does not exist" containerID="aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.405826 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b"} err="failed to get container status \"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b\": rpc error: code = NotFound desc = could not find container \"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b\": container with ID starting with aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b not found: ID does not exist" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.405850 4795 scope.go:117] "RemoveContainer" containerID="d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.417990 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.418048 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5s2s\" (UniqueName: \"kubernetes.io/projected/644df2e5-37fd-468b-9e52-316d44e65f69-kube-api-access-q5s2s\") pod \"nova-api-0\" (UID: 
\"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.418093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644df2e5-37fd-468b-9e52-316d44e65f69-logs\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.418131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-config-data\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.418785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644df2e5-37fd-468b-9e52-316d44e65f69-logs\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.434780 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.435222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-config-data\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.438861 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5s2s\" (UniqueName: 
\"kubernetes.io/projected/644df2e5-37fd-468b-9e52-316d44e65f69-kube-api-access-q5s2s\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.449891 4795 scope.go:117] "RemoveContainer" containerID="ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.519940 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.520065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-config-data\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.520103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-logs\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.520130 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmpsv\" (UniqueName: \"kubernetes.io/projected/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-kube-api-access-tmpsv\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.533492 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" path="/var/lib/kubelet/pods/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9/volumes" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.537513 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8af13c78-4805-4828-980c-45e1defd94c3" path="/var/lib/kubelet/pods/8af13c78-4805-4828-980c-45e1defd94c3/volumes" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.583989 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.590900 4795 scope.go:117] "RemoveContainer" containerID="d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.591435 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def\": container with ID starting with d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def not found: ID does not exist" containerID="d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.591486 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def"} err="failed to get container status \"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def\": rpc error: code = NotFound desc = could not find container \"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def\": container with ID starting with d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def not found: ID does not exist" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.591513 4795 scope.go:117] "RemoveContainer" containerID="ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" Feb 19 23:47:05 crc 
kubenswrapper[4795]: E0219 23:47:05.591948 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21\": container with ID starting with ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21 not found: ID does not exist" containerID="ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.591973 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21"} err="failed to get container status \"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21\": rpc error: code = NotFound desc = could not find container \"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21\": container with ID starting with ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21 not found: ID does not exist" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.594943 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.595597 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.595669 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.595912 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.597506 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.602658 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.603563 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.604357 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.604542 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.604779 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.609609 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.609761 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.617642 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.639711 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") pod \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.641444 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") pod \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.641563 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") pod \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.642273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-config-data\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.645150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-logs\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.642986 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.646941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-logs\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.648299 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmpsv\" (UniqueName: \"kubernetes.io/projected/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-kube-api-access-tmpsv\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.648708 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs" (OuterVolumeSpecName: "kube-api-access-4dpxs") pod "2eb28a2e-eb12-4867-9c26-3416349cc1cc" (UID: "2eb28a2e-eb12-4867-9c26-3416349cc1cc"). InnerVolumeSpecName "kube-api-access-4dpxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.649236 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.649710 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.653346 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-config-data\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.669183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmpsv\" (UniqueName: \"kubernetes.io/projected/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-kube-api-access-tmpsv\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.669324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.693812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "2eb28a2e-eb12-4867-9c26-3416349cc1cc" (UID: "2eb28a2e-eb12-4867-9c26-3416349cc1cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.706827 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data" (OuterVolumeSpecName: "config-data") pod "2eb28a2e-eb12-4867-9c26-3416349cc1cc" (UID: "2eb28a2e-eb12-4867-9c26-3416349cc1cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.730828 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.784702 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785041 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785533 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785754 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785984 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.786000 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.889702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" 
(UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.889767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.889896 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.889913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890041 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890083 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890118 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890206 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.891103 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.892086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.897157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.899850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 
crc kubenswrapper[4795]: I0219 23:47:05.902742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.903705 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.903785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.903884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.904225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.906826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.907566 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.909485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.915004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.926805 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.101915 4795 generic.go:334] "Generic (PLEG): container finished" podID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" exitCode=0 Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.102217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2eb28a2e-eb12-4867-9c26-3416349cc1cc","Type":"ContainerDied","Data":"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2"} Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.102245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2eb28a2e-eb12-4867-9c26-3416349cc1cc","Type":"ContainerDied","Data":"93528f93304bf625c4781fad623cd7f9e1b6953a05d729d20b96627d846cf536"} Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.102260 4795 scope.go:117] "RemoveContainer" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.102401 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.148151 4795 scope.go:117] "RemoveContainer" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" Feb 19 23:47:06 crc kubenswrapper[4795]: E0219 23:47:06.151635 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2\": container with ID starting with 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 not found: ID does not exist" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.151682 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2"} err="failed to get container status \"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2\": rpc error: code = NotFound desc = could not find container \"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2\": container with ID starting with 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 not found: ID does not exist" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.184397 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.208779 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.232260 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.233771 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.236377 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.262331 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.276442 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.316039 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: W0219 23:47:06.329861 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd15a66bd_d8e7_4ad0_a8bc_7575a218f50c.slice/crio-269ea3a3cc7c782717513c519defdf6e40bbb4f24b5f8b9cc525bbe3b1585b80 WatchSource:0}: Error finding container 269ea3a3cc7c782717513c519defdf6e40bbb4f24b5f8b9cc525bbe3b1585b80: Status 404 returned error can't find the container with id 269ea3a3cc7c782717513c519defdf6e40bbb4f24b5f8b9cc525bbe3b1585b80 Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.407930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-config-data\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.408150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvmf\" (UniqueName: \"kubernetes.io/projected/d16cd452-43cb-42e4-b4af-6de3271d7194-kube-api-access-ggvmf\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 
23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.408297 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.509687 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvmf\" (UniqueName: \"kubernetes.io/projected/d16cd452-43cb-42e4-b4af-6de3271d7194-kube-api-access-ggvmf\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.509768 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.509838 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-config-data\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.519961 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-config-data\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.520122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.527880 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggvmf\" (UniqueName: \"kubernetes.io/projected/d16cd452-43cb-42e4-b4af-6de3271d7194-kube-api-access-ggvmf\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.571624 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.723702 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.091837 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.141619 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"644df2e5-37fd-468b-9e52-316d44e65f69","Type":"ContainerStarted","Data":"0f5fd6cdf83515d2353737625145defcce8f805453e8e49904eda86c74087d38"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.141671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"644df2e5-37fd-468b-9e52-316d44e65f69","Type":"ContainerStarted","Data":"2df44d43566b44017947ac4a40886bbe921efeaa0dfe870b97a2200c128147a8"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.141685 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"644df2e5-37fd-468b-9e52-316d44e65f69","Type":"ContainerStarted","Data":"f310f9961bfa7e30dccb9b463be3fd81c2a95eff37d9a32e545f107088030354"} Feb 19 
23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.146698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d16cd452-43cb-42e4-b4af-6de3271d7194","Type":"ContainerStarted","Data":"bb9261dd0e2bd9d5c273788bbe975558f6574c3d54d777e4a936c18b717d1be3"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.165157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" event={"ID":"59981ca7-620e-4025-b165-4f54f920e8f2","Type":"ContainerStarted","Data":"775470b3cc1f8cc005107bb43cf5801c2166fca168c3fce4a4df488ec55ed38a"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.172590 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.172571975 podStartE2EDuration="2.172571975s" podCreationTimestamp="2026-02-19 23:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:47:07.168643407 +0000 UTC m=+8338.361161271" watchObservedRunningTime="2026-02-19 23:47:07.172571975 +0000 UTC m=+8338.365089839" Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.187296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c","Type":"ContainerStarted","Data":"375b22bf9ffd8064d8642f2b8d04f6a411a967b3310e411f925cc15f15fc29e4"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.187356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c","Type":"ContainerStarted","Data":"581ca652c9580fd55c1e58d31ee1935d33bc98270dac22244ea7159db91b1f7c"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.187368 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c","Type":"ContainerStarted","Data":"269ea3a3cc7c782717513c519defdf6e40bbb4f24b5f8b9cc525bbe3b1585b80"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.205106 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.205087055 podStartE2EDuration="2.205087055s" podCreationTimestamp="2026-02-19 23:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:47:07.203362877 +0000 UTC m=+8338.395880751" watchObservedRunningTime="2026-02-19 23:47:07.205087055 +0000 UTC m=+8338.397604919" Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.551716 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" path="/var/lib/kubelet/pods/2eb28a2e-eb12-4867-9c26-3416349cc1cc/volumes" Feb 19 23:47:08 crc kubenswrapper[4795]: I0219 23:47:08.206104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d16cd452-43cb-42e4-b4af-6de3271d7194","Type":"ContainerStarted","Data":"435a833fa4fef5d0a6c7a1dbfb25df6372e9d725b8d48ca4e2085b2c4259562a"} Feb 19 23:47:08 crc kubenswrapper[4795]: I0219 23:47:08.210432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" event={"ID":"59981ca7-620e-4025-b165-4f54f920e8f2","Type":"ContainerStarted","Data":"a825f4508bbcb5fe04299d8794a0d2181819f8e6cdbaee23877c97437e821ffd"} Feb 19 23:47:08 crc kubenswrapper[4795]: I0219 23:47:08.236855 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.236828937 podStartE2EDuration="2.236828937s" podCreationTimestamp="2026-02-19 23:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-19 23:47:08.224329891 +0000 UTC m=+8339.416847745" watchObservedRunningTime="2026-02-19 23:47:08.236828937 +0000 UTC m=+8339.429346801" Feb 19 23:47:08 crc kubenswrapper[4795]: I0219 23:47:08.249994 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" podStartSLOduration=2.8040324500000002 podStartE2EDuration="3.24997587s" podCreationTimestamp="2026-02-19 23:47:05 +0000 UTC" firstStartedPulling="2026-02-19 23:47:06.586369164 +0000 UTC m=+8337.778887028" lastFinishedPulling="2026-02-19 23:47:07.032312594 +0000 UTC m=+8338.224830448" observedRunningTime="2026-02-19 23:47:08.248560381 +0000 UTC m=+8339.441078245" watchObservedRunningTime="2026-02-19 23:47:08.24997587 +0000 UTC m=+8339.442493734" Feb 19 23:47:10 crc kubenswrapper[4795]: I0219 23:47:10.732683 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:47:10 crc kubenswrapper[4795]: I0219 23:47:10.734085 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:47:11 crc kubenswrapper[4795]: I0219 23:47:11.724661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 23:47:12 crc kubenswrapper[4795]: I0219 23:47:12.519099 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:13 crc kubenswrapper[4795]: I0219 23:47:13.575120 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:15 crc kubenswrapper[4795]: I0219 23:47:15.618190 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:47:15 crc kubenswrapper[4795]: I0219 23:47:15.618823 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Feb 19 23:47:15 crc kubenswrapper[4795]: I0219 23:47:15.733138 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:47:15 crc kubenswrapper[4795]: I0219 23:47:15.733232 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.701516 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="644df2e5-37fd-468b-9e52-316d44e65f69" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.701548 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="644df2e5-37fd-468b-9e52-316d44e65f69" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.724617 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.774559 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d15a66bd-d8e7-4ad0-a8bc-7575a218f50c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.798903 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.815869 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="d15a66bd-d8e7-4ad0-a8bc-7575a218f50c" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:47:17 crc kubenswrapper[4795]: I0219 23:47:17.344818 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.622359 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.623795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.626848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.628473 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.734436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.734598 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.736010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.737027 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:47:26 crc kubenswrapper[4795]: I0219 23:47:26.401120 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:47:26 crc kubenswrapper[4795]: I0219 23:47:26.407520 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Feb 19 23:47:28 crc kubenswrapper[4795]: I0219 23:47:28.427224 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:47:28 crc kubenswrapper[4795]: I0219 23:47:28.427618 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.427706 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.428310 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.428377 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.429554 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.429648 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" gracePeriod=600 Feb 19 23:47:58 crc kubenswrapper[4795]: E0219 23:47:58.554198 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.772744 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" exitCode=0 Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.772819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"} Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.772895 4795 scope.go:117] "RemoveContainer" containerID="9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.774015 4795 
scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:47:58 crc kubenswrapper[4795]: E0219 23:47:58.774546 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:09 crc kubenswrapper[4795]: I0219 23:48:09.530502 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:09 crc kubenswrapper[4795]: E0219 23:48:09.532223 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:20 crc kubenswrapper[4795]: I0219 23:48:20.512549 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:20 crc kubenswrapper[4795]: E0219 23:48:20.513476 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:33 crc kubenswrapper[4795]: I0219 
23:48:33.512286 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:33 crc kubenswrapper[4795]: E0219 23:48:33.513551 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:47 crc kubenswrapper[4795]: I0219 23:48:47.511963 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:47 crc kubenswrapper[4795]: E0219 23:48:47.512997 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:58 crc kubenswrapper[4795]: I0219 23:48:58.511509 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:58 crc kubenswrapper[4795]: E0219 23:48:58.512450 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:49:11 crc 
kubenswrapper[4795]: I0219 23:49:11.511721 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:49:11 crc kubenswrapper[4795]: E0219 23:49:11.513025 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:49:25 crc kubenswrapper[4795]: I0219 23:49:25.512128 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:49:25 crc kubenswrapper[4795]: E0219 23:49:25.512875 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:49:37 crc kubenswrapper[4795]: I0219 23:49:37.511645 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:49:37 crc kubenswrapper[4795]: E0219 23:49:37.512537 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 
19 23:49:49 crc kubenswrapper[4795]: I0219 23:49:49.518227 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:49:49 crc kubenswrapper[4795]: E0219 23:49:49.519011 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:50:04 crc kubenswrapper[4795]: I0219 23:50:04.512593 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:50:04 crc kubenswrapper[4795]: E0219 23:50:04.513379 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:50:18 crc kubenswrapper[4795]: I0219 23:50:18.512313 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:50:18 crc kubenswrapper[4795]: E0219 23:50:18.513119 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:50:23 crc kubenswrapper[4795]: I0219 23:50:23.301142 4795 generic.go:334] "Generic (PLEG): container finished" podID="59981ca7-620e-4025-b165-4f54f920e8f2" containerID="a825f4508bbcb5fe04299d8794a0d2181819f8e6cdbaee23877c97437e821ffd" exitCode=0 Feb 19 23:50:23 crc kubenswrapper[4795]: I0219 23:50:23.301261 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" event={"ID":"59981ca7-620e-4025-b165-4f54f920e8f2","Type":"ContainerDied","Data":"a825f4508bbcb5fe04299d8794a0d2181819f8e6cdbaee23877c97437e821ffd"} Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.779438 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956725 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " 
Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957112 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957145 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957190 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.962221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv" (OuterVolumeSpecName: "kube-api-access-jwhdv") pod 
"59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "kube-api-access-jwhdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.967514 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.976908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph" (OuterVolumeSpecName: "ceph") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.992414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory" (OuterVolumeSpecName: "inventory") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.004989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cells-global-config-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.008921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.013342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.013957 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.015462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.015870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.015994 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.019563 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.023495 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.059887 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060253 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060268 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060280 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060295 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060308 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060320 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") 
on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060329 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060337 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060347 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060355 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060366 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060378 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.323701 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" 
event={"ID":"59981ca7-620e-4025-b165-4f54f920e8f2","Type":"ContainerDied","Data":"775470b3cc1f8cc005107bb43cf5801c2166fca168c3fce4a4df488ec55ed38a"}
Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.323739 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="775470b3cc1f8cc005107bb43cf5801c2166fca168c3fce4a4df488ec55ed38a"
Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.323773 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h"
Feb 19 23:50:30 crc kubenswrapper[4795]: I0219 23:50:30.512418 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:50:30 crc kubenswrapper[4795]: E0219 23:50:30.513054 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:50:45 crc kubenswrapper[4795]: I0219 23:50:45.512841 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:50:45 crc kubenswrapper[4795]: E0219 23:50:45.514922 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:50:58 crc kubenswrapper[4795]: I0219 23:50:58.511838 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:50:58 crc kubenswrapper[4795]: E0219 23:50:58.512692 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:51:10 crc kubenswrapper[4795]: I0219 23:51:10.513284 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:51:10 crc kubenswrapper[4795]: E0219 23:51:10.514372 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:51:18 crc kubenswrapper[4795]: I0219 23:51:18.748586 4795 trace.go:236] Trace[1281311727]: "Calculate volume metrics of ovn-data for pod openstack/ovn-copy-data" (19-Feb-2026 23:51:17.717) (total time: 1030ms):
Feb 19 23:51:18 crc kubenswrapper[4795]: Trace[1281311727]: [1.030662002s] [1.030662002s] END
Feb 19 23:51:25 crc kubenswrapper[4795]: I0219 23:51:25.514121 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:51:25 crc kubenswrapper[4795]: E0219 23:51:25.515102 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:51:37 crc kubenswrapper[4795]: I0219 23:51:37.512251 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:51:37 crc kubenswrapper[4795]: E0219 23:51:37.513660 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:51:49 crc kubenswrapper[4795]: I0219 23:51:49.520875 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:51:49 crc kubenswrapper[4795]: E0219 23:51:49.524093 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:52:02 crc kubenswrapper[4795]: I0219 23:52:02.514141 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:52:02 crc kubenswrapper[4795]: E0219 23:52:02.515015 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:52:14 crc kubenswrapper[4795]: I0219 23:52:14.511512 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:52:14 crc kubenswrapper[4795]: E0219 23:52:14.512493 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:52:29 crc kubenswrapper[4795]: I0219 23:52:29.525366 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:52:29 crc kubenswrapper[4795]: E0219 23:52:29.526732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:52:40 crc kubenswrapper[4795]: I0219 23:52:40.512061 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:52:40 crc kubenswrapper[4795]: E0219 23:52:40.512736 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:52:55 crc kubenswrapper[4795]: I0219 23:52:55.512471 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:52:55 crc kubenswrapper[4795]: E0219 23:52:55.513517 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:53:06 crc kubenswrapper[4795]: I0219 23:53:06.512126 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:53:07 crc kubenswrapper[4795]: I0219 23:53:07.050028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6"}
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.569057 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-47mst"]
Feb 19 23:53:15 crc kubenswrapper[4795]: E0219 23:53:15.571095 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59981ca7-620e-4025-b165-4f54f920e8f2" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.571277 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="59981ca7-620e-4025-b165-4f54f920e8f2" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.571557 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="59981ca7-620e-4025-b165-4f54f920e8f2" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.573412 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.602024 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47mst"]
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.751080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.751454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.751666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.853984 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.854583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.854726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.855383 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.855828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.885062 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.907383 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:16 crc kubenswrapper[4795]: I0219 23:53:16.434821 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47mst"]
Feb 19 23:53:17 crc kubenswrapper[4795]: I0219 23:53:17.155319 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e0ad945-0620-480a-8200-fce17a619511" containerID="c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3" exitCode=0
Feb 19 23:53:17 crc kubenswrapper[4795]: I0219 23:53:17.155399 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerDied","Data":"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3"}
Feb 19 23:53:17 crc kubenswrapper[4795]: I0219 23:53:17.155651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerStarted","Data":"654266e485a03d2ad418348459ffcdd5cbe44c629d72206aae359e312bfb70eb"}
Feb 19 23:53:17 crc kubenswrapper[4795]: I0219 23:53:17.157264 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 23:53:18 crc kubenswrapper[4795]: I0219 23:53:18.168980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerStarted","Data":"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8"}
Feb 19 23:53:19 crc kubenswrapper[4795]: I0219 23:53:19.182632 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e0ad945-0620-480a-8200-fce17a619511" containerID="245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8" exitCode=0
Feb 19 23:53:19 crc kubenswrapper[4795]: I0219 23:53:19.182728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerDied","Data":"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8"}
Feb 19 23:53:20 crc kubenswrapper[4795]: I0219 23:53:20.073221 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Feb 19 23:53:20 crc kubenswrapper[4795]: I0219 23:53:20.073724 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerName="adoption" containerID="cri-o://d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224" gracePeriod=30
Feb 19 23:53:20 crc kubenswrapper[4795]: I0219 23:53:20.194479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerStarted","Data":"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc"}
Feb 19 23:53:20 crc kubenswrapper[4795]: I0219 23:53:20.222857 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-47mst" podStartSLOduration=2.815418555 podStartE2EDuration="5.22283707s" podCreationTimestamp="2026-02-19 23:53:15 +0000 UTC" firstStartedPulling="2026-02-19 23:53:17.156997619 +0000 UTC m=+8708.349515493" lastFinishedPulling="2026-02-19 23:53:19.564416134 +0000 UTC m=+8710.756934008" observedRunningTime="2026-02-19 23:53:20.215897246 +0000 UTC m=+8711.408415130" watchObservedRunningTime="2026-02-19 23:53:20.22283707 +0000 UTC m=+8711.415354934"
Feb 19 23:53:25 crc kubenswrapper[4795]: I0219 23:53:25.908449 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:25 crc kubenswrapper[4795]: I0219 23:53:25.909119 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:25 crc kubenswrapper[4795]: I0219 23:53:25.965274 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:26 crc kubenswrapper[4795]: I0219 23:53:26.303667 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.210221 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47mst"]
Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.293798 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-47mst" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="registry-server" containerID="cri-o://bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc" gracePeriod=2
Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.826343 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.974549 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") pod \"3e0ad945-0620-480a-8200-fce17a619511\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") "
Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.974642 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") pod \"3e0ad945-0620-480a-8200-fce17a619511\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") "
Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.974747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") pod \"3e0ad945-0620-480a-8200-fce17a619511\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") "
Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.975902 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities" (OuterVolumeSpecName: "utilities") pod "3e0ad945-0620-480a-8200-fce17a619511" (UID: "3e0ad945-0620-480a-8200-fce17a619511"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.976582 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.982710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv" (OuterVolumeSpecName: "kube-api-access-7fllv") pod "3e0ad945-0620-480a-8200-fce17a619511" (UID: "3e0ad945-0620-480a-8200-fce17a619511"). InnerVolumeSpecName "kube-api-access-7fllv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.039447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e0ad945-0620-480a-8200-fce17a619511" (UID: "3e0ad945-0620-480a-8200-fce17a619511"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.079021 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") on node \"crc\" DevicePath \"\""
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.079065 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.217008 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"]
Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.217551 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="extract-content"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.217571 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="extract-content"
Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.217598 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="registry-server"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.217605 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="registry-server"
Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.217616 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="extract-utilities"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.217623 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="extract-utilities"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.217966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="registry-server"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.220918 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.229055 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"]
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309089 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e0ad945-0620-480a-8200-fce17a619511" containerID="bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc" exitCode=0
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309191 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47mst"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerDied","Data":"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc"}
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309267 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerDied","Data":"654266e485a03d2ad418348459ffcdd5cbe44c629d72206aae359e312bfb70eb"}
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309289 4795 scope.go:117] "RemoveContainer" containerID="bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.343070 4795 scope.go:117] "RemoveContainer" containerID="245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.352416 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47mst"]
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.364752 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-47mst"]
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.375990 4795 scope.go:117] "RemoveContainer" containerID="c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.389013 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.389082 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.389210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431091 4795 scope.go:117] "RemoveContainer" containerID="bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc"
Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.431521 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc\": container with ID starting with bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc not found: ID does not exist" containerID="bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431578 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc"} err="failed to get container status \"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc\": rpc error: code = NotFound desc = could not find container \"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc\": container with ID starting with bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc not found: ID does not exist"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431605 4795 scope.go:117] "RemoveContainer" containerID="245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8"
Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.431932 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8\": container with ID starting with 245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8 not found: ID does not exist" containerID="245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431951 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8"} err="failed to get container status \"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8\": rpc error: code = NotFound desc = could not find container \"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8\": container with ID starting with 245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8 not found: ID does not exist"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431966 4795 scope.go:117] "RemoveContainer" containerID="c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3"
Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.432528 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3\": container with ID starting with c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3 not found: ID does not exist" containerID="c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.432548 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3"} err="failed to get container status \"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3\": rpc error: code = NotFound desc = could not find container \"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3\": container with ID starting with c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3 not found: ID does not exist"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.491563 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.491717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.491791 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.492315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.493578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.507740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.595367 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.116589 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"]
Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.319622 4795 generic.go:334] "Generic (PLEG): container finished" podID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerID="9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75" exitCode=0
Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.319799 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerDied","Data":"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75"}
Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.320000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerStarted","Data":"40a03b7739c80ba579fdcb69feece003a5e727cd18891934c7e249fece457390"}
Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.523293 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e0ad945-0620-480a-8200-fce17a619511" path="/var/lib/kubelet/pods/3e0ad945-0620-480a-8200-fce17a619511/volumes"
Feb 19 23:53:32 crc kubenswrapper[4795]: I0219 23:53:32.335224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerStarted","Data":"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351"}
Feb 19 23:53:33 crc kubenswrapper[4795]: I0219 23:53:33.349791 4795 generic.go:334] "Generic (PLEG): container finished" podID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerID="b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351" exitCode=0
Feb 19 23:53:33 crc kubenswrapper[4795]: I0219 23:53:33.349921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerDied","Data":"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351"}
Feb 19 23:53:34 crc kubenswrapper[4795]: I0219 23:53:34.361946 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerStarted","Data":"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5"}
Feb 19 23:53:34 crc kubenswrapper[4795]: I0219 23:53:34.393925 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4cfwt" podStartSLOduration=1.968909531 podStartE2EDuration="4.39390489s" podCreationTimestamp="2026-02-19 23:53:30 +0000 UTC" firstStartedPulling="2026-02-19 23:53:31.321682439 +0000 UTC m=+8722.514200303" lastFinishedPulling="2026-02-19 23:53:33.746677798 +0000 UTC m=+8724.939195662" observedRunningTime="2026-02-19 23:53:34.38356481 +0000 UTC m=+8725.576082674" watchObservedRunningTime="2026-02-19 23:53:34.39390489 +0000 UTC m=+8725.586422754"
Feb 19 23:53:40 crc kubenswrapper[4795]: I0219 23:53:40.595980 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:40 crc kubenswrapper[4795]: I0219 23:53:40.596683 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:40 crc kubenswrapper[4795]: I0219 23:53:40.652821 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:41 crc kubenswrapper[4795]: I0219 23:53:41.498292 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:41 crc kubenswrapper[4795]: I0219 23:53:41.560405 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"]
Feb 19 23:53:43 crc kubenswrapper[4795]: I0219 23:53:43.447122 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4cfwt" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="registry-server" containerID="cri-o://b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" gracePeriod=2
Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.018958 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cfwt"
Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.101944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") pod \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") "
Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.102217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") pod \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") "
Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.102386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") pod
\"2031ff34-f306-4920-8bb4-a6f0151a9aa3\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.103080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities" (OuterVolumeSpecName: "utilities") pod "2031ff34-f306-4920-8bb4-a6f0151a9aa3" (UID: "2031ff34-f306-4920-8bb4-a6f0151a9aa3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.107586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz" (OuterVolumeSpecName: "kube-api-access-2jdmz") pod "2031ff34-f306-4920-8bb4-a6f0151a9aa3" (UID: "2031ff34-f306-4920-8bb4-a6f0151a9aa3"). InnerVolumeSpecName "kube-api-access-2jdmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.150706 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2031ff34-f306-4920-8bb4-a6f0151a9aa3" (UID: "2031ff34-f306-4920-8bb4-a6f0151a9aa3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.205810 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.206258 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.206379 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.458019 4795 generic.go:334] "Generic (PLEG): container finished" podID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerID="b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" exitCode=0 Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.458159 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.458236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerDied","Data":"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5"} Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.459735 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerDied","Data":"40a03b7739c80ba579fdcb69feece003a5e727cd18891934c7e249fece457390"} Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.459806 4795 scope.go:117] "RemoveContainer" containerID="b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.494558 4795 scope.go:117] "RemoveContainer" containerID="b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.508499 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"] Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.519035 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"] Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.520494 4795 scope.go:117] "RemoveContainer" containerID="9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.565356 4795 scope.go:117] "RemoveContainer" containerID="b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" Feb 19 23:53:44 crc kubenswrapper[4795]: E0219 23:53:44.566302 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5\": container with ID starting with b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5 not found: ID does not exist" containerID="b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.566332 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5"} err="failed to get container status \"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5\": rpc error: code = NotFound desc = could not find container \"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5\": container with ID starting with b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5 not found: ID does not exist" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.566351 4795 scope.go:117] "RemoveContainer" containerID="b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351" Feb 19 23:53:44 crc kubenswrapper[4795]: E0219 23:53:44.566861 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351\": container with ID starting with b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351 not found: ID does not exist" containerID="b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.566912 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351"} err="failed to get container status \"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351\": rpc error: code = NotFound desc = could not find container \"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351\": container with ID 
starting with b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351 not found: ID does not exist" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.566952 4795 scope.go:117] "RemoveContainer" containerID="9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75" Feb 19 23:53:44 crc kubenswrapper[4795]: E0219 23:53:44.567286 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75\": container with ID starting with 9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75 not found: ID does not exist" containerID="9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.567311 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75"} err="failed to get container status \"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75\": rpc error: code = NotFound desc = could not find container \"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75\": container with ID starting with 9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75 not found: ID does not exist" Feb 19 23:53:45 crc kubenswrapper[4795]: I0219 23:53:45.527436 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" path="/var/lib/kubelet/pods/2031ff34-f306-4920-8bb4-a6f0151a9aa3/volumes" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.522033 4795 generic.go:334] "Generic (PLEG): container finished" podID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerID="d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224" exitCode=137 Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.522995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/mariadb-copy-data" event={"ID":"4f232979-ab9c-4b59-8ad8-7756367fe0bf","Type":"ContainerDied","Data":"d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224"} Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.683740 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.793619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.793803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") pod \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.803506 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb" (OuterVolumeSpecName: "kube-api-access-pzsrb") pod "4f232979-ab9c-4b59-8ad8-7756367fe0bf" (UID: "4f232979-ab9c-4b59-8ad8-7756367fe0bf"). InnerVolumeSpecName "kube-api-access-pzsrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.817228 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8" (OuterVolumeSpecName: "mariadb-data") pod "4f232979-ab9c-4b59-8ad8-7756367fe0bf" (UID: "4f232979-ab9c-4b59-8ad8-7756367fe0bf"). InnerVolumeSpecName "pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.905582 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.905639 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") on node \"crc\" " Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.931615 4795 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.931782 4795 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8") on node "crc" Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.007231 4795 reconciler_common.go:293] "Volume detached for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.537131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"4f232979-ab9c-4b59-8ad8-7756367fe0bf","Type":"ContainerDied","Data":"6654d86759c1aab8319afbb64f442570ab81fe83e940c47c9e22d533fa8c1665"} Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.537217 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.537500 4795 scope.go:117] "RemoveContainer" containerID="d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224" Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.604637 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.623998 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 23:53:52 crc kubenswrapper[4795]: I0219 23:53:52.362989 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 23:53:52 crc kubenswrapper[4795]: I0219 23:53:52.363529 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="59d77cc9-140e-4468-9023-0a973155d290" containerName="adoption" containerID="cri-o://a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f" gracePeriod=30 Feb 19 23:53:53 crc kubenswrapper[4795]: I0219 23:53:53.525469 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" path="/var/lib/kubelet/pods/4f232979-ab9c-4b59-8ad8-7756367fe0bf/volumes" Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.853223 4795 generic.go:334] "Generic (PLEG): container finished" podID="59d77cc9-140e-4468-9023-0a973155d290" containerID="a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f" exitCode=137 Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.853299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"59d77cc9-140e-4468-9023-0a973155d290","Type":"ContainerDied","Data":"a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f"} Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.853615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" 
event={"ID":"59d77cc9-140e-4468-9023-0a973155d290","Type":"ContainerDied","Data":"d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8"} Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.853631 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8" Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.940359 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.994065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"59d77cc9-140e-4468-9023-0a973155d290\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.994432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcjlg\" (UniqueName: \"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") pod \"59d77cc9-140e-4468-9023-0a973155d290\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.994483 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") pod \"59d77cc9-140e-4468-9023-0a973155d290\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.001111 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg" (OuterVolumeSpecName: "kube-api-access-jcjlg") pod "59d77cc9-140e-4468-9023-0a973155d290" (UID: "59d77cc9-140e-4468-9023-0a973155d290"). 
InnerVolumeSpecName "kube-api-access-jcjlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.002310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "59d77cc9-140e-4468-9023-0a973155d290" (UID: "59d77cc9-140e-4468-9023-0a973155d290"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.019289 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42" (OuterVolumeSpecName: "ovn-data") pod "59d77cc9-140e-4468-9023-0a973155d290" (UID: "59d77cc9-140e-4468-9023-0a973155d290"). InnerVolumeSpecName "pvc-fe46f653-3b46-49f1-9da1-d67576b17f42". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.097419 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcjlg\" (UniqueName: \"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.097469 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.097525 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") on node \"crc\" " Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.129353 4795 csi_attacher.go:630] kubernetes.io/csi: 
attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.129567 4795 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fe46f653-3b46-49f1-9da1-d67576b17f42" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42") on node "crc" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.199741 4795 reconciler_common.go:293] "Volume detached for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.864956 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.897915 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.910395 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 23:54:25 crc kubenswrapper[4795]: I0219 23:54:25.522513 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59d77cc9-140e-4468-9023-0a973155d290" path="/var/lib/kubelet/pods/59d77cc9-140e-4468-9023-0a973155d290/volumes" Feb 19 23:54:29 crc kubenswrapper[4795]: I0219 23:54:29.054104 4795 scope.go:117] "RemoveContainer" containerID="a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f" Feb 19 23:55:28 crc kubenswrapper[4795]: I0219 23:55:28.427485 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:55:28 crc kubenswrapper[4795]: I0219 
23:55:28.427936 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213026 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"] Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213809 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="registry-server" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213832 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="registry-server" Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213858 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d77cc9-140e-4468-9023-0a973155d290" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213865 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d77cc9-140e-4468-9023-0a973155d290" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213888 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="extract-utilities" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213894 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="extract-utilities" Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213909 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="extract-content" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213914 4795 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="extract-content" Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213928 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213937 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.214157 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="registry-server" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.214171 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.214204 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d77cc9-140e-4468-9023-0a973155d290" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.215406 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.217078 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gv265"/"default-dockercfg-wm7vn" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.217560 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gv265"/"openshift-service-ca.crt" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.217892 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gv265"/"kube-root-ca.crt" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.231447 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"] Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.291565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.291907 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.393487 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " 
pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.393600 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.394328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.727789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.837703 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gv265/must-gather-ltjmp"
Feb 19 23:55:30 crc kubenswrapper[4795]: I0219 23:55:30.365286 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"]
Feb 19 23:55:30 crc kubenswrapper[4795]: I0219 23:55:30.553824 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/must-gather-ltjmp" event={"ID":"d06c32e0-d01f-47e9-871b-9fdfb391d796","Type":"ContainerStarted","Data":"5fd8d3d5525d94b61cf12ae80a853096d968888e1ccdce6628c381ab590f1eb7"}
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.735534 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"]
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.755434 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.776326 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"]
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.830243 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.830314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.830411 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.931968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.932045 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.932218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.932883 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.932966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.959396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:36 crc kubenswrapper[4795]: I0219 23:55:36.109732 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:37 crc kubenswrapper[4795]: W0219 23:55:37.295407 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2da7b5_d729_463f_9589_455203a5ad9e.slice/crio-a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572 WatchSource:0}: Error finding container a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572: Status 404 returned error can't find the container with id a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572
Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.297352 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"]
Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.626560 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerID="62deb3f2bf568b395b46bc81de81ee26457a95f02f006897224aa24bc7dac271" exitCode=0
Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.626614 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerDied","Data":"62deb3f2bf568b395b46bc81de81ee26457a95f02f006897224aa24bc7dac271"}
Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.626870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerStarted","Data":"a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572"}
Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.629597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/must-gather-ltjmp" event={"ID":"d06c32e0-d01f-47e9-871b-9fdfb391d796","Type":"ContainerStarted","Data":"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70"}
Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.629638 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/must-gather-ltjmp" event={"ID":"d06c32e0-d01f-47e9-871b-9fdfb391d796","Type":"ContainerStarted","Data":"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"}
Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.666379 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gv265/must-gather-ltjmp" podStartSLOduration=2.196674599 podStartE2EDuration="8.666358667s" podCreationTimestamp="2026-02-19 23:55:29 +0000 UTC" firstStartedPulling="2026-02-19 23:55:30.371206465 +0000 UTC m=+8841.563724329" lastFinishedPulling="2026-02-19 23:55:36.840890543 +0000 UTC m=+8848.033408397" observedRunningTime="2026-02-19 23:55:37.657763495 +0000 UTC m=+8848.850281359" watchObservedRunningTime="2026-02-19 23:55:37.666358667 +0000 UTC m=+8848.858876531"
Feb 19 23:55:39 crc kubenswrapper[4795]: I0219 23:55:39.654472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerStarted","Data":"07a4317117426bae86bf07c8d5e610095ccbae15e950025068b03dd800e0da4a"}
Feb 19 23:55:41 crc kubenswrapper[4795]: I0219 23:55:41.842344 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gv265/crc-debug-htwdp"]
Feb 19 23:55:41 crc kubenswrapper[4795]: I0219 23:55:41.844223 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:41 crc kubenswrapper[4795]: I0219 23:55:41.969589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:41 crc kubenswrapper[4795]: I0219 23:55:41.969773 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.076763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.076877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.077113 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.101145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.164820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: W0219 23:55:42.212894 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0993101a_b7bc_40fd_a75c_9d6eefe49025.slice/crio-a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088 WatchSource:0}: Error finding container a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088: Status 404 returned error can't find the container with id a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.680011 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-htwdp" event={"ID":"0993101a-b7bc-40fd-a75c-9d6eefe49025","Type":"ContainerStarted","Data":"a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088"}
Feb 19 23:55:43 crc kubenswrapper[4795]: I0219 23:55:43.691045 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerID="07a4317117426bae86bf07c8d5e610095ccbae15e950025068b03dd800e0da4a" exitCode=0
Feb 19 23:55:43 crc kubenswrapper[4795]: I0219 23:55:43.691143 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerDied","Data":"07a4317117426bae86bf07c8d5e610095ccbae15e950025068b03dd800e0da4a"}
Feb 19 23:55:44 crc kubenswrapper[4795]: I0219 23:55:44.702225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerStarted","Data":"6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2"}
Feb 19 23:55:44 crc kubenswrapper[4795]: I0219 23:55:44.733667 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sz6pt" podStartSLOduration=3.3004902019999998 podStartE2EDuration="9.733652725s" podCreationTimestamp="2026-02-19 23:55:35 +0000 UTC" firstStartedPulling="2026-02-19 23:55:37.628940747 +0000 UTC m=+8848.821458601" lastFinishedPulling="2026-02-19 23:55:44.06210326 +0000 UTC m=+8855.254621124" observedRunningTime="2026-02-19 23:55:44.728475889 +0000 UTC m=+8855.920993743" watchObservedRunningTime="2026-02-19 23:55:44.733652725 +0000 UTC m=+8855.926170579"
Feb 19 23:55:46 crc kubenswrapper[4795]: I0219 23:55:46.110602 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:46 crc kubenswrapper[4795]: I0219 23:55:46.110858 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:47 crc kubenswrapper[4795]: I0219 23:55:47.197852 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:55:47 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:55:47 crc kubenswrapper[4795]: >
Feb 19 23:55:54 crc kubenswrapper[4795]: I0219 23:55:54.796381 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-htwdp" event={"ID":"0993101a-b7bc-40fd-a75c-9d6eefe49025","Type":"ContainerStarted","Data":"de244572ed412b6d15883ac7be00a681de7409d595350d300ebc7aac8f96c690"}
Feb 19 23:55:54 crc kubenswrapper[4795]: I0219 23:55:54.818157 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gv265/crc-debug-htwdp" podStartSLOduration=1.7409129650000001 podStartE2EDuration="13.818138079s" podCreationTimestamp="2026-02-19 23:55:41 +0000 UTC" firstStartedPulling="2026-02-19 23:55:42.215635965 +0000 UTC m=+8853.408153839" lastFinishedPulling="2026-02-19 23:55:54.292861099 +0000 UTC m=+8865.485378953" observedRunningTime="2026-02-19 23:55:54.810976878 +0000 UTC m=+8866.003494742" watchObservedRunningTime="2026-02-19 23:55:54.818138079 +0000 UTC m=+8866.010655943"
Feb 19 23:55:57 crc kubenswrapper[4795]: I0219 23:55:57.165275 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:55:57 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:55:57 crc kubenswrapper[4795]: >
Feb 19 23:55:58 crc kubenswrapper[4795]: I0219 23:55:58.427532 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:55:58 crc kubenswrapper[4795]: I0219 23:55:58.427849 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:56:07 crc kubenswrapper[4795]: I0219 23:56:07.157502 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:56:07 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:56:07 crc kubenswrapper[4795]: >
Feb 19 23:56:17 crc kubenswrapper[4795]: I0219 23:56:17.155780 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:56:17 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:56:17 crc kubenswrapper[4795]: >
Feb 19 23:56:21 crc kubenswrapper[4795]: I0219 23:56:21.088643 4795 generic.go:334] "Generic (PLEG): container finished" podID="0993101a-b7bc-40fd-a75c-9d6eefe49025" containerID="de244572ed412b6d15883ac7be00a681de7409d595350d300ebc7aac8f96c690" exitCode=0
Feb 19 23:56:21 crc kubenswrapper[4795]: I0219 23:56:21.088720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-htwdp" event={"ID":"0993101a-b7bc-40fd-a75c-9d6eefe49025","Type":"ContainerDied","Data":"de244572ed412b6d15883ac7be00a681de7409d595350d300ebc7aac8f96c690"}
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.265539 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.314532 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gv265/crc-debug-htwdp"]
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.332857 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gv265/crc-debug-htwdp"]
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.413257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") pod \"0993101a-b7bc-40fd-a75c-9d6eefe49025\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") "
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.413517 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") pod \"0993101a-b7bc-40fd-a75c-9d6eefe49025\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") "
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.413629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host" (OuterVolumeSpecName: "host") pod "0993101a-b7bc-40fd-a75c-9d6eefe49025" (UID: "0993101a-b7bc-40fd-a75c-9d6eefe49025"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.414087 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.421958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9" (OuterVolumeSpecName: "kube-api-access-hbpz9") pod "0993101a-b7bc-40fd-a75c-9d6eefe49025" (UID: "0993101a-b7bc-40fd-a75c-9d6eefe49025"). InnerVolumeSpecName "kube-api-access-hbpz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.516488 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.107451 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.107523 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:56:23 crc kubenswrapper[4795]: E0219 23:56:23.346987 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0993101a_b7bc_40fd_a75c_9d6eefe49025.slice/crio-a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0993101a_b7bc_40fd_a75c_9d6eefe49025.slice\": RecentStats: unable to find data in memory cache]"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.502983 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gv265/crc-debug-b2t7k"]
Feb 19 23:56:23 crc kubenswrapper[4795]: E0219 23:56:23.503475 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0993101a-b7bc-40fd-a75c-9d6eefe49025" containerName="container-00"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.503494 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0993101a-b7bc-40fd-a75c-9d6eefe49025" containerName="container-00"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.503708 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0993101a-b7bc-40fd-a75c-9d6eefe49025" containerName="container-00"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.504432 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.524402 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0993101a-b7bc-40fd-a75c-9d6eefe49025" path="/var/lib/kubelet/pods/0993101a-b7bc-40fd-a75c-9d6eefe49025/volumes"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.535775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.536062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.638650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.638754 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.638884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.657283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.822972 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:24 crc kubenswrapper[4795]: I0219 23:56:24.120256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-b2t7k" event={"ID":"c8080932-0083-4fa5-9816-6f8f4c16c917","Type":"ContainerStarted","Data":"3644eb2ff205a6ecbf078ee98cc404ff628a9964f7f066ce4f38a2487488db5f"}
Feb 19 23:56:24 crc kubenswrapper[4795]: I0219 23:56:24.120586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-b2t7k" event={"ID":"c8080932-0083-4fa5-9816-6f8f4c16c917","Type":"ContainerStarted","Data":"0b0224086177e9301d265a5e33b0ab2e6fe6e59ed4f2c42819f86d26d52c4151"}
Feb 19 23:56:24 crc kubenswrapper[4795]: I0219 23:56:24.131353 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gv265/crc-debug-b2t7k" podStartSLOduration=1.131334908 podStartE2EDuration="1.131334908s" podCreationTimestamp="2026-02-19 23:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:56:24.130533935 +0000 UTC m=+8895.323051799" watchObservedRunningTime="2026-02-19 23:56:24.131334908 +0000 UTC m=+8895.323852772"
Feb 19 23:56:25 crc kubenswrapper[4795]: I0219 23:56:25.131258 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8080932-0083-4fa5-9816-6f8f4c16c917" containerID="3644eb2ff205a6ecbf078ee98cc404ff628a9964f7f066ce4f38a2487488db5f" exitCode=0
Feb 19 23:56:25 crc kubenswrapper[4795]: I0219 23:56:25.131359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-b2t7k" event={"ID":"c8080932-0083-4fa5-9816-6f8f4c16c917","Type":"ContainerDied","Data":"3644eb2ff205a6ecbf078ee98cc404ff628a9964f7f066ce4f38a2487488db5f"}
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.168236 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.236929 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.255699 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.293970 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gv265/crc-debug-b2t7k"]
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.302235 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gv265/crc-debug-b2t7k"]
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.391488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") pod \"c8080932-0083-4fa5-9816-6f8f4c16c917\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") "
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.391615 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") pod \"c8080932-0083-4fa5-9816-6f8f4c16c917\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") "
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.391847 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host" (OuterVolumeSpecName: "host") pod "c8080932-0083-4fa5-9816-6f8f4c16c917" (UID: "c8080932-0083-4fa5-9816-6f8f4c16c917"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.393297 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.401509 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45" (OuterVolumeSpecName: "kube-api-access-42v45") pod "c8080932-0083-4fa5-9816-6f8f4c16c917" (UID: "c8080932-0083-4fa5-9816-6f8f4c16c917"). InnerVolumeSpecName "kube-api-access-42v45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.405890 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"]
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.495092 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.150832 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b0224086177e9301d265a5e33b0ab2e6fe6e59ed4f2c42819f86d26d52c4151"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.150870 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.523838 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8080932-0083-4fa5-9816-6f8f4c16c917" path="/var/lib/kubelet/pods/c8080932-0083-4fa5-9816-6f8f4c16c917/volumes"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.526914 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gv265/crc-debug-5lxqn"]
Feb 19 23:56:27 crc kubenswrapper[4795]: E0219 23:56:27.527413 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8080932-0083-4fa5-9816-6f8f4c16c917" containerName="container-00"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.527495 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8080932-0083-4fa5-9816-6f8f4c16c917" containerName="container-00"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.527787 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8080932-0083-4fa5-9816-6f8f4c16c917" containerName="container-00"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.531879 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.721319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.721471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.823954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.824132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.824285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.846844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.851787 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: W0219 23:56:27.884417 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fd74ed_b96f_4d54_bde4_7812aa6f92be.slice/crio-bbe1f7b37dea849a53f62bcadc373fd195c3dd7ff3250a51919068d5d7de11e2 WatchSource:0}: Error finding container bbe1f7b37dea849a53f62bcadc373fd195c3dd7ff3250a51919068d5d7de11e2: Status 404 returned error can't find the container with id bbe1f7b37dea849a53f62bcadc373fd195c3dd7ff3250a51919068d5d7de11e2
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.161413 4795 generic.go:334] "Generic (PLEG): container finished" podID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" containerID="7aaf83518c533d8a7a22f272741bc623bd1d1cbd9f71a60f196c67d16987e279" exitCode=0
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.161489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-5lxqn" event={"ID":"48fd74ed-b96f-4d54-bde4-7812aa6f92be","Type":"ContainerDied","Data":"7aaf83518c533d8a7a22f272741bc623bd1d1cbd9f71a60f196c67d16987e279"}
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.161759 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-5lxqn" event={"ID":"48fd74ed-b96f-4d54-bde4-7812aa6f92be","Type":"ContainerStarted","Data":"bbe1f7b37dea849a53f62bcadc373fd195c3dd7ff3250a51919068d5d7de11e2"}
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.161917 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" containerID="cri-o://6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2" gracePeriod=2
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.226729 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gv265/crc-debug-5lxqn"]
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.241479 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gv265/crc-debug-5lxqn"]
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.427235 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.427550 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.427597 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.428396 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.428453 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6" gracePeriod=600
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.172254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6"}
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.172158 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6" exitCode=0
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.172690 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.176969 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerID="6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2" exitCode=0
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.177196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerDied","Data":"6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2"}
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.177224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt"
event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerDied","Data":"a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572"} Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.177236 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.372812 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.387208 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-5lxqn" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.559568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") pod \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") pod \"8f2da7b5-d729-463f-9589-455203a5ad9e\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560213 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") pod \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560254 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host" (OuterVolumeSpecName: "host") pod "48fd74ed-b96f-4d54-bde4-7812aa6f92be" (UID: "48fd74ed-b96f-4d54-bde4-7812aa6f92be"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") pod \"8f2da7b5-d729-463f-9589-455203a5ad9e\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") pod \"8f2da7b5-d729-463f-9589-455203a5ad9e\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560982 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.561091 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities" (OuterVolumeSpecName: "utilities") pod "8f2da7b5-d729-463f-9589-455203a5ad9e" (UID: "8f2da7b5-d729-463f-9589-455203a5ad9e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.567804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw" (OuterVolumeSpecName: "kube-api-access-s26gw") pod "48fd74ed-b96f-4d54-bde4-7812aa6f92be" (UID: "48fd74ed-b96f-4d54-bde4-7812aa6f92be"). InnerVolumeSpecName "kube-api-access-s26gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.588661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf" (OuterVolumeSpecName: "kube-api-access-trfkf") pod "8f2da7b5-d729-463f-9589-455203a5ad9e" (UID: "8f2da7b5-d729-463f-9589-455203a5ad9e"). InnerVolumeSpecName "kube-api-access-trfkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.662681 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.662713 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.662723 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.680212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f2da7b5-d729-463f-9589-455203a5ad9e" (UID: "8f2da7b5-d729-463f-9589-455203a5ad9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.764355 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.187394 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-5lxqn" Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.187400 4795 scope.go:117] "RemoveContainer" containerID="7aaf83518c533d8a7a22f272741bc623bd1d1cbd9f71a60f196c67d16987e279" Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.191002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"} Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.191047 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.279359 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"] Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.293827 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"] Feb 19 23:56:31 crc kubenswrapper[4795]: I0219 23:56:31.529143 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" path="/var/lib/kubelet/pods/48fd74ed-b96f-4d54-bde4-7812aa6f92be/volumes" Feb 19 23:56:31 crc kubenswrapper[4795]: I0219 23:56:31.530157 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" path="/var/lib/kubelet/pods/8f2da7b5-d729-463f-9589-455203a5ad9e/volumes" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.987680 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:57:48 crc kubenswrapper[4795]: E0219 23:57:48.989211 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="extract-utilities" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989229 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="extract-utilities" Feb 19 23:57:48 crc kubenswrapper[4795]: E0219 23:57:48.989243 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="extract-content" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989250 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="extract-content" Feb 19 23:57:48 crc kubenswrapper[4795]: E0219 23:57:48.989296 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" containerName="container-00" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989303 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" containerName="container-00" Feb 19 23:57:48 crc kubenswrapper[4795]: E0219 23:57:48.989322 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989329 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989571 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" containerName="container-00" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989600 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.991785 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.019081 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.044742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.044803 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.044988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.146411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.146531 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.146589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.146886 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.147042 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.166622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.320848 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.794733 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:57:50 crc kubenswrapper[4795]: I0219 23:57:50.036501 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerID="f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9" exitCode=0 Feb 19 23:57:50 crc kubenswrapper[4795]: I0219 23:57:50.036542 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerDied","Data":"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9"} Feb 19 23:57:50 crc kubenswrapper[4795]: I0219 23:57:50.036574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerStarted","Data":"a0f70866ad3d1dff70355360c8352691af4dac5cdb64e7815192fee5af7a7b5f"} Feb 19 23:57:51 crc kubenswrapper[4795]: I0219 23:57:51.049926 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerStarted","Data":"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d"} Feb 19 23:57:52 crc kubenswrapper[4795]: I0219 23:57:52.071908 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerID="c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d" exitCode=0 Feb 19 23:57:52 crc kubenswrapper[4795]: I0219 23:57:52.071966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" 
event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerDied","Data":"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d"} Feb 19 23:57:53 crc kubenswrapper[4795]: I0219 23:57:53.083698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerStarted","Data":"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f"} Feb 19 23:57:53 crc kubenswrapper[4795]: I0219 23:57:53.106458 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c5b66" podStartSLOduration=2.668373126 podStartE2EDuration="5.106442149s" podCreationTimestamp="2026-02-19 23:57:48 +0000 UTC" firstStartedPulling="2026-02-19 23:57:50.038714508 +0000 UTC m=+8981.231232372" lastFinishedPulling="2026-02-19 23:57:52.476783521 +0000 UTC m=+8983.669301395" observedRunningTime="2026-02-19 23:57:53.101364496 +0000 UTC m=+8984.293882350" watchObservedRunningTime="2026-02-19 23:57:53.106442149 +0000 UTC m=+8984.298960013" Feb 19 23:57:59 crc kubenswrapper[4795]: I0219 23:57:59.321597 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:59 crc kubenswrapper[4795]: I0219 23:57:59.322231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:59 crc kubenswrapper[4795]: I0219 23:57:59.378283 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:58:00 crc kubenswrapper[4795]: I0219 23:58:00.200948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:58:00 crc kubenswrapper[4795]: I0219 23:58:00.259414 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.173134 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c5b66" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="registry-server" containerID="cri-o://5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" gracePeriod=2 Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.656403 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.742572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") pod \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.742852 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") pod \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.742955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") pod \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.743467 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities" (OuterVolumeSpecName: "utilities") pod "a2382916-87a1-4ef9-9461-45b6ab2b24a3" (UID: 
"a2382916-87a1-4ef9-9461-45b6ab2b24a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.743578 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.751347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn" (OuterVolumeSpecName: "kube-api-access-888bn") pod "a2382916-87a1-4ef9-9461-45b6ab2b24a3" (UID: "a2382916-87a1-4ef9-9461-45b6ab2b24a3"). InnerVolumeSpecName "kube-api-access-888bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.780716 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2382916-87a1-4ef9-9461-45b6ab2b24a3" (UID: "a2382916-87a1-4ef9-9461-45b6ab2b24a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.846302 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") on node \"crc\" DevicePath \"\"" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.846343 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.184781 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerID="5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" exitCode=0 Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.184839 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.184857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerDied","Data":"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f"} Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.187628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerDied","Data":"a0f70866ad3d1dff70355360c8352691af4dac5cdb64e7815192fee5af7a7b5f"} Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.187676 4795 scope.go:117] "RemoveContainer" containerID="5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.223842 4795 scope.go:117] "RemoveContainer" 
containerID="c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.234809 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.247927 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.252495 4795 scope.go:117] "RemoveContainer" containerID="f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.293618 4795 scope.go:117] "RemoveContainer" containerID="5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" Feb 19 23:58:03 crc kubenswrapper[4795]: E0219 23:58:03.294197 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f\": container with ID starting with 5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f not found: ID does not exist" containerID="5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.294258 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f"} err="failed to get container status \"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f\": rpc error: code = NotFound desc = could not find container \"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f\": container with ID starting with 5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f not found: ID does not exist" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.294294 4795 scope.go:117] "RemoveContainer" 
containerID="c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d" Feb 19 23:58:03 crc kubenswrapper[4795]: E0219 23:58:03.297368 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d\": container with ID starting with c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d not found: ID does not exist" containerID="c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.297406 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d"} err="failed to get container status \"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d\": rpc error: code = NotFound desc = could not find container \"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d\": container with ID starting with c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d not found: ID does not exist" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.297429 4795 scope.go:117] "RemoveContainer" containerID="f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9" Feb 19 23:58:03 crc kubenswrapper[4795]: E0219 23:58:03.297874 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9\": container with ID starting with f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9 not found: ID does not exist" containerID="f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.297930 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9"} err="failed to get container status \"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9\": rpc error: code = NotFound desc = could not find container \"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9\": container with ID starting with f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9 not found: ID does not exist" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.522937 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" path="/var/lib/kubelet/pods/a2382916-87a1-4ef9-9461-45b6ab2b24a3/volumes" Feb 19 23:58:58 crc kubenswrapper[4795]: I0219 23:58:58.427120 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:58:58 crc kubenswrapper[4795]: I0219 23:58:58.427754 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:59:28 crc kubenswrapper[4795]: I0219 23:59:28.427638 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:59:28 crc kubenswrapper[4795]: I0219 23:59:28.428252 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.427428 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.428155 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.428225 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.429440 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.429532 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
containerID="cri-o://f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" gracePeriod=600 Feb 19 23:59:58 crc kubenswrapper[4795]: E0219 23:59:58.861620 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:59:59 crc kubenswrapper[4795]: I0219 23:59:59.716516 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" exitCode=0 Feb 19 23:59:59 crc kubenswrapper[4795]: I0219 23:59:59.716585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"} Feb 19 23:59:59 crc kubenswrapper[4795]: I0219 23:59:59.716892 4795 scope.go:117] "RemoveContainer" containerID="826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6" Feb 19 23:59:59 crc kubenswrapper[4795]: I0219 23:59:59.718345 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 19 23:59:59 crc kubenswrapper[4795]: E0219 23:59:59.719190 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.153360 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-purge-29525760-9s2b5"] Feb 20 00:00:00 crc kubenswrapper[4795]: E0220 00:00:00.154396 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="extract-utilities" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.154419 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="extract-utilities" Feb 20 00:00:00 crc kubenswrapper[4795]: E0220 00:00:00.154446 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="extract-content" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.154453 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="extract-content" Feb 20 00:00:00 crc kubenswrapper[4795]: E0220 00:00:00.154477 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="registry-server" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.154485 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="registry-server" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.154678 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="registry-server" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.155565 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.157722 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.168234 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-purge-29525760-n9sd6"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.170082 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.172601 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.181849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29525760-n9sd6"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.194573 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29525760-9s2b5"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.263480 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29525760-mtv5d"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.265010 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.274567 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.274826 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.296791 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29525760-mtv5d"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.319422 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.319549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.319700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.319927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.320029 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwmfj\" (UniqueName: \"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.320129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.320228 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.320262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc 
kubenswrapper[4795]: I0220 00:00:00.352897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.354532 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.364319 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.365415 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.388157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422109 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422223 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwmfj\" (UniqueName: \"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422407 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422454 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422553 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.428893 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.430236 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.430338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.431982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.432296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.438590 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.441550 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.443686 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwmfj\" (UniqueName: 
\"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.524638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.525056 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.525195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.525532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.526841 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.527428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.544465 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.568694 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.580246 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.597061 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.631812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.632865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.633204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.649762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.650015 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.653585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.703512 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:01 crc kubenswrapper[4795]: W0220 00:00:01.097389 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fbbf12e_019a_40d4_9a07_46b3e5b4c814.slice/crio-a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a WatchSource:0}: Error finding container a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a: Status 404 returned error can't find the container with id a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.098764 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29525760-9s2b5"] Feb 20 00:00:01 crc kubenswrapper[4795]: W0220 00:00:01.102271 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57de3f43_e33f_4734_b02d_372d013b7e80.slice/crio-0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59 WatchSource:0}: Error finding container 
0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59: Status 404 returned error can't find the container with id 0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59 Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.108521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29525760-n9sd6"] Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.261887 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29525760-mtv5d"] Feb 20 00:00:01 crc kubenswrapper[4795]: W0220 00:00:01.267700 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29390c83_c5f7_4c7a_8f48_9a02661a1108.slice/crio-671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1 WatchSource:0}: Error finding container 671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1: Status 404 returned error can't find the container with id 671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1 Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.389503 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx"] Feb 20 00:00:01 crc kubenswrapper[4795]: W0220 00:00:01.401927 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7809862b_911c_4763_af00_a74f3fbf2500.slice/crio-9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68 WatchSource:0}: Error finding container 9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68: Status 404 returned error can't find the container with id 9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68 Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.758864 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" event={"ID":"7809862b-911c-4763-af00-a74f3fbf2500","Type":"ContainerStarted","Data":"243b5fea31103c3415f81359e3db7f940d7d50a74e1aebe60823f05f7ae58db8"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.759252 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" event={"ID":"7809862b-911c-4763-af00-a74f3fbf2500","Type":"ContainerStarted","Data":"9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.762953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" event={"ID":"57de3f43-e33f-4734-b02d-372d013b7e80","Type":"ContainerStarted","Data":"83d35cb72b27cfec9969608628b10b81e4e1659431946fa44a79242cb3e941e1"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.763020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" event={"ID":"57de3f43-e33f-4734-b02d-372d013b7e80","Type":"ContainerStarted","Data":"0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.764872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-mtv5d" event={"ID":"29390c83-c5f7-4c7a-8f48-9a02661a1108","Type":"ContainerStarted","Data":"a733d24075e34eb1adca6b5da2d53baa0ebfe92c14da71f66c920c375d7b4d91"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.765015 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-mtv5d" event={"ID":"29390c83-c5f7-4c7a-8f48-9a02661a1108","Type":"ContainerStarted","Data":"671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.768293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-purge-29525760-9s2b5" event={"ID":"2fbbf12e-019a-40d4-9a07-46b3e5b4c814","Type":"ContainerStarted","Data":"2160bcdaff135da8488dd0341e2f0ef84c741af0030bb2e1d363fb7e5d8d2784"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.768390 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" event={"ID":"2fbbf12e-019a-40d4-9a07-46b3e5b4c814","Type":"ContainerStarted","Data":"a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.781148 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" podStartSLOduration=1.781127589 podStartE2EDuration="1.781127589s" podCreationTimestamp="2026-02-20 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.770949761 +0000 UTC m=+9112.963467625" watchObservedRunningTime="2026-02-20 00:00:01.781127589 +0000 UTC m=+9112.973645453" Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.794915 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29525760-mtv5d" podStartSLOduration=1.794894518 podStartE2EDuration="1.794894518s" podCreationTimestamp="2026-02-20 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.790694889 +0000 UTC m=+9112.983212773" watchObservedRunningTime="2026-02-20 00:00:01.794894518 +0000 UTC m=+9112.987412382" Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.836507 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" podStartSLOduration=1.836487255 podStartE2EDuration="1.836487255s" podCreationTimestamp="2026-02-20 
00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.822143769 +0000 UTC m=+9113.014661643" watchObservedRunningTime="2026-02-20 00:00:01.836487255 +0000 UTC m=+9113.029005129" Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.848030 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" podStartSLOduration=1.848015301 podStartE2EDuration="1.848015301s" podCreationTimestamp="2026-02-20 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.839935572 +0000 UTC m=+9113.032453436" watchObservedRunningTime="2026-02-20 00:00:01.848015301 +0000 UTC m=+9113.040533165" Feb 20 00:00:02 crc kubenswrapper[4795]: I0220 00:00:02.780346 4795 generic.go:334] "Generic (PLEG): container finished" podID="7809862b-911c-4763-af00-a74f3fbf2500" containerID="243b5fea31103c3415f81359e3db7f940d7d50a74e1aebe60823f05f7ae58db8" exitCode=0 Feb 20 00:00:02 crc kubenswrapper[4795]: I0220 00:00:02.782285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" event={"ID":"7809862b-911c-4763-af00-a74f3fbf2500","Type":"ContainerDied","Data":"243b5fea31103c3415f81359e3db7f940d7d50a74e1aebe60823f05f7ae58db8"} Feb 20 00:00:03 crc kubenswrapper[4795]: I0220 00:00:03.791331 4795 generic.go:334] "Generic (PLEG): container finished" podID="29390c83-c5f7-4c7a-8f48-9a02661a1108" containerID="a733d24075e34eb1adca6b5da2d53baa0ebfe92c14da71f66c920c375d7b4d91" exitCode=0 Feb 20 00:00:03 crc kubenswrapper[4795]: I0220 00:00:03.791407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-mtv5d" 
event={"ID":"29390c83-c5f7-4c7a-8f48-9a02661a1108","Type":"ContainerDied","Data":"a733d24075e34eb1adca6b5da2d53baa0ebfe92c14da71f66c920c375d7b4d91"} Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.248657 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.426069 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") pod \"7809862b-911c-4763-af00-a74f3fbf2500\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.426184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") pod \"7809862b-911c-4763-af00-a74f3fbf2500\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.426319 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") pod \"7809862b-911c-4763-af00-a74f3fbf2500\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.427227 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume" (OuterVolumeSpecName: "config-volume") pod "7809862b-911c-4763-af00-a74f3fbf2500" (UID: "7809862b-911c-4763-af00-a74f3fbf2500"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.445625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7809862b-911c-4763-af00-a74f3fbf2500" (UID: "7809862b-911c-4763-af00-a74f3fbf2500"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.445747 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk" (OuterVolumeSpecName: "kube-api-access-sfxmk") pod "7809862b-911c-4763-af00-a74f3fbf2500" (UID: "7809862b-911c-4763-af00-a74f3fbf2500"). InnerVolumeSpecName "kube-api-access-sfxmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.457389 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.467689 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.529228 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.529264 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.529274 4795 reconciler_common.go:293] "Volume detached 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.803405 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.810360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" event={"ID":"7809862b-911c-4763-af00-a74f3fbf2500","Type":"ContainerDied","Data":"9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68"} Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.810448 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.178601 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.341539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") pod \"29390c83-c5f7-4c7a-8f48-9a02661a1108\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.341836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") pod \"29390c83-c5f7-4c7a-8f48-9a02661a1108\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.342448 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca" (OuterVolumeSpecName: "serviceca") pod "29390c83-c5f7-4c7a-8f48-9a02661a1108" (UID: "29390c83-c5f7-4c7a-8f48-9a02661a1108"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.349039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8" (OuterVolumeSpecName: "kube-api-access-858m8") pod "29390c83-c5f7-4c7a-8f48-9a02661a1108" (UID: "29390c83-c5f7-4c7a-8f48-9a02661a1108"). InnerVolumeSpecName "kube-api-access-858m8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.444523 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.444568 4795 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.524281 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eafa182-621e-48fe-a019-360c2f94c212" path="/var/lib/kubelet/pods/3eafa182-621e-48fe-a019-360c2f94c212/volumes" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.813641 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-mtv5d" event={"ID":"29390c83-c5f7-4c7a-8f48-9a02661a1108","Type":"ContainerDied","Data":"671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1"} Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.813693 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.813759 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:07 crc kubenswrapper[4795]: I0220 00:00:07.832699 4795 generic.go:334] "Generic (PLEG): container finished" podID="57de3f43-e33f-4734-b02d-372d013b7e80" containerID="83d35cb72b27cfec9969608628b10b81e4e1659431946fa44a79242cb3e941e1" exitCode=0 Feb 20 00:00:07 crc kubenswrapper[4795]: I0220 00:00:07.832774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" event={"ID":"57de3f43-e33f-4734-b02d-372d013b7e80","Type":"ContainerDied","Data":"83d35cb72b27cfec9969608628b10b81e4e1659431946fa44a79242cb3e941e1"} Feb 20 00:00:07 crc kubenswrapper[4795]: I0220 00:00:07.836674 4795 generic.go:334] "Generic (PLEG): container finished" podID="2fbbf12e-019a-40d4-9a07-46b3e5b4c814" containerID="2160bcdaff135da8488dd0341e2f0ef84c741af0030bb2e1d363fb7e5d8d2784" exitCode=0 Feb 20 00:00:07 crc kubenswrapper[4795]: I0220 00:00:07.836713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" event={"ID":"2fbbf12e-019a-40d4-9a07-46b3e5b4c814","Type":"ContainerDied","Data":"2160bcdaff135da8488dd0341e2f0ef84c741af0030bb2e1d363fb7e5d8d2784"} Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.369997 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.377232 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.527591 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") pod \"57de3f43-e33f-4734-b02d-372d013b7e80\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.527797 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") pod \"57de3f43-e33f-4734-b02d-372d013b7e80\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.527906 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") pod \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") pod \"57de3f43-e33f-4734-b02d-372d013b7e80\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528109 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") pod \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") pod \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528360 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwmfj\" (UniqueName: \"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") pod \"57de3f43-e33f-4734-b02d-372d013b7e80\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") pod \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.533372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts" (OuterVolumeSpecName: "scripts") pod "2fbbf12e-019a-40d4-9a07-46b3e5b4c814" (UID: "2fbbf12e-019a-40d4-9a07-46b3e5b4c814"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.534064 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts" (OuterVolumeSpecName: "scripts") pod "57de3f43-e33f-4734-b02d-372d013b7e80" (UID: "57de3f43-e33f-4734-b02d-372d013b7e80"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.534299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x" (OuterVolumeSpecName: "kube-api-access-qmg5x") pod "2fbbf12e-019a-40d4-9a07-46b3e5b4c814" (UID: "2fbbf12e-019a-40d4-9a07-46b3e5b4c814"). InnerVolumeSpecName "kube-api-access-qmg5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.535545 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj" (OuterVolumeSpecName: "kube-api-access-qwmfj") pod "57de3f43-e33f-4734-b02d-372d013b7e80" (UID: "57de3f43-e33f-4734-b02d-372d013b7e80"). InnerVolumeSpecName "kube-api-access-qwmfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.557087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fbbf12e-019a-40d4-9a07-46b3e5b4c814" (UID: "2fbbf12e-019a-40d4-9a07-46b3e5b4c814"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.557679 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data" (OuterVolumeSpecName: "config-data") pod "57de3f43-e33f-4734-b02d-372d013b7e80" (UID: "57de3f43-e33f-4734-b02d-372d013b7e80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.559724 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data" (OuterVolumeSpecName: "config-data") pod "2fbbf12e-019a-40d4-9a07-46b3e5b4c814" (UID: "2fbbf12e-019a-40d4-9a07-46b3e5b4c814"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.562749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57de3f43-e33f-4734-b02d-372d013b7e80" (UID: "57de3f43-e33f-4734-b02d-372d013b7e80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631246 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631275 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631287 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631295 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwmfj\" (UniqueName: \"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") on node \"crc\" DevicePath \"\"" Feb 20 
00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631308 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631317 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631325 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631332 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.856694 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" event={"ID":"2fbbf12e-019a-40d4-9a07-46b3e5b4c814","Type":"ContainerDied","Data":"a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a"} Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.856733 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.856747 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.857987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" event={"ID":"57de3f43-e33f-4734-b02d-372d013b7e80","Type":"ContainerDied","Data":"0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59"} Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.858016 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.858022 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59" Feb 20 00:00:10 crc kubenswrapper[4795]: I0220 00:00:10.511652 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:00:10 crc kubenswrapper[4795]: E0220 00:00:10.512045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:00:23 crc kubenswrapper[4795]: I0220 00:00:23.512031 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:00:23 crc kubenswrapper[4795]: E0220 00:00:23.512826 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:00:29 crc kubenswrapper[4795]: I0220 00:00:29.284775 4795 scope.go:117] "RemoveContainer" containerID="506e26301ea1d8c97cb2f36560ab4d1582f8a10338a4a109885babb88591cfe0" Feb 20 00:00:35 crc kubenswrapper[4795]: I0220 00:00:35.512380 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:00:35 crc kubenswrapper[4795]: E0220 00:00:35.513361 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:00:48 crc kubenswrapper[4795]: I0220 00:00:48.512043 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:00:48 crc kubenswrapper[4795]: E0220 00:00:48.512802 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.191536 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-purge-29525761-zbskn"] Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.192796 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="57de3f43-e33f-4734-b02d-372d013b7e80" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.192813 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57de3f43-e33f-4734-b02d-372d013b7e80" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.192829 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7809862b-911c-4763-af00-a74f3fbf2500" containerName="collect-profiles" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.192837 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7809862b-911c-4763-af00-a74f3fbf2500" containerName="collect-profiles" Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.192876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29390c83-c5f7-4c7a-8f48-9a02661a1108" containerName="image-pruner" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.192885 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="29390c83-c5f7-4c7a-8f48-9a02661a1108" containerName="image-pruner" Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.192900 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbbf12e-019a-40d4-9a07-46b3e5b4c814" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.192908 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbbf12e-019a-40d4-9a07-46b3e5b4c814" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.193140 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="29390c83-c5f7-4c7a-8f48-9a02661a1108" containerName="image-pruner" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.193193 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="57de3f43-e33f-4734-b02d-372d013b7e80" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.193214 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2fbbf12e-019a-40d4-9a07-46b3e5b4c814" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.193232 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7809862b-911c-4763-af00-a74f3fbf2500" containerName="collect-profiles" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.194181 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.201597 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.201747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.201946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.202072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: 
\"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.212857 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-purge-29525761-5slgz"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.215004 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.239853 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-purge-29525761-lcwsj"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.241807 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525761-d78q6"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.243046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.243244 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.248725 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.248917 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-purge-29525761-tq4s7"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.252316 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.260657 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-purge-29525761-5slgz"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.275578 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-purge-29525761-tq4s7"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.288723 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29525761-lcwsj"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.304566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.304657 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.304728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.304777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.314115 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525761-d78q6"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.319052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.324929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.335448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.336768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 
00:01:00.398354 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29525761-zbskn"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409336 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409366 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413183 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413326 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " 
pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.511456 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.511761 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.516950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.517006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.517038 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.517116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.517186 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519305 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519383 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.520067 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.520372 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.520692 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.521127 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.526675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.535320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.543098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.553000 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.556806 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.557316 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.564188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.564396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: 
\"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.580312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.586816 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.588418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.588900 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.588985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: 
I0220 00:01:00.591824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.597764 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.605791 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.606690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.737808 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.746693 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.755618 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.101444 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29525761-zbskn"] Feb 20 00:01:01 crc kubenswrapper[4795]: W0220 00:01:01.322250 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dc6e226_d501_4698_b49c_f07fc8e80339.slice/crio-839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98 WatchSource:0}: Error finding container 839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98: Status 404 returned error can't find the container with id 839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98 Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.326451 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-purge-29525761-5slgz"] Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.341683 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525761-d78q6"] Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.389711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d78q6" event={"ID":"e307d045-9890-4475-8c51-395484da10ca","Type":"ContainerStarted","Data":"4775fd8acc8285432dc75f9369474eda69ed12d7e07fe84286320ce0fe245768"} Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.415883 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29525761-lcwsj"] Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.418570 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-purge-29525761-5slgz" event={"ID":"8dc6e226-d501-4698-b49c-f07fc8e80339","Type":"ContainerStarted","Data":"839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98"} Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.422055 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-zbskn" event={"ID":"fe324720-6e0b-4d15-bc6e-3875b26bf7f4","Type":"ContainerStarted","Data":"d245ab4deae1c20cb273c4781102e64ba81f2352131918cd3603be3874e3b24b"} Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.427941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-purge-29525761-tq4s7"] Feb 20 00:01:01 crc kubenswrapper[4795]: W0220 00:01:01.429061 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9344b7be_b07a_4660_9352_dfdbcecac424.slice/crio-f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e WatchSource:0}: Error finding container f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e: Status 404 returned error can't find the container with id f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e Feb 20 00:01:01 crc kubenswrapper[4795]: W0220 00:01:01.454830 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd673a8e7_fd1c_4bd1_ad6b_fb18a187b5cb.slice/crio-735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13 WatchSource:0}: Error finding container 735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13: Status 404 returned error can't find the container with id 735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13 Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.455568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-lcwsj" event={"ID":"9344b7be-b07a-4660-9352-dfdbcecac424","Type":"ContainerStarted","Data":"f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.460778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-tq4s7" 
event={"ID":"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb","Type":"ContainerStarted","Data":"faad2ded953d5657f7a517328a5146fa339ecadd1a2f2c12485311d9ecfd00f1"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.460857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-tq4s7" event={"ID":"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb","Type":"ContainerStarted","Data":"735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.480945 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-purge-29525761-tq4s7" podStartSLOduration=2.480924389 podStartE2EDuration="2.480924389s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:02.478762238 +0000 UTC m=+9173.671280102" watchObservedRunningTime="2026-02-20 00:01:02.480924389 +0000 UTC m=+9173.673442253" Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.521116 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d78q6" event={"ID":"e307d045-9890-4475-8c51-395484da10ca","Type":"ContainerStarted","Data":"d952c9318d01d0f2fbed23c519883c7b2ce798dd43e62951ff40a36d13484b9f"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.525492 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-purge-29525761-5slgz" event={"ID":"8dc6e226-d501-4698-b49c-f07fc8e80339","Type":"ContainerStarted","Data":"7f1b10e04140da2231a173788a781af5c72a39132101aedc853450f4470db79d"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.545424 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525761-d78q6" podStartSLOduration=2.5453996119999998 podStartE2EDuration="2.545399612s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:02.538974231 +0000 UTC m=+9173.731492095" watchObservedRunningTime="2026-02-20 00:01:02.545399612 +0000 UTC m=+9173.737917476" Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.576124 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-purge-29525761-5slgz" podStartSLOduration=2.576105011 podStartE2EDuration="2.576105011s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:02.565276815 +0000 UTC m=+9173.757794679" watchObservedRunningTime="2026-02-20 00:01:02.576105011 +0000 UTC m=+9173.768622875" Feb 20 00:01:03 crc kubenswrapper[4795]: I0220 00:01:03.537941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-lcwsj" event={"ID":"9344b7be-b07a-4660-9352-dfdbcecac424","Type":"ContainerStarted","Data":"d03ab37f975de2c279ad295fe976f236b962ccc100e8c364d616ff19c0f5983e"} Feb 20 00:01:03 crc kubenswrapper[4795]: I0220 00:01:03.540770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-zbskn" event={"ID":"fe324720-6e0b-4d15-bc6e-3875b26bf7f4","Type":"ContainerStarted","Data":"4f952e7e5b352c1a9af1933c2d718e36ecf71d34b2f2efc9a84b624480ca61e8"} Feb 20 00:01:03 crc kubenswrapper[4795]: I0220 00:01:03.559626 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-purge-29525761-lcwsj" podStartSLOduration=3.559604756 podStartE2EDuration="3.559604756s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:03.558920027 +0000 UTC m=+9174.751437891" watchObservedRunningTime="2026-02-20 
00:01:03.559604756 +0000 UTC m=+9174.752122620" Feb 20 00:01:03 crc kubenswrapper[4795]: I0220 00:01:03.579099 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-purge-29525761-zbskn" podStartSLOduration=3.579078407 podStartE2EDuration="3.579078407s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:03.575502726 +0000 UTC m=+9174.768020590" watchObservedRunningTime="2026-02-20 00:01:03.579078407 +0000 UTC m=+9174.771596261" Feb 20 00:01:04 crc kubenswrapper[4795]: I0220 00:01:04.551231 4795 generic.go:334] "Generic (PLEG): container finished" podID="8dc6e226-d501-4698-b49c-f07fc8e80339" containerID="7f1b10e04140da2231a173788a781af5c72a39132101aedc853450f4470db79d" exitCode=0 Feb 20 00:01:04 crc kubenswrapper[4795]: I0220 00:01:04.551313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-purge-29525761-5slgz" event={"ID":"8dc6e226-d501-4698-b49c-f07fc8e80339","Type":"ContainerDied","Data":"7f1b10e04140da2231a173788a781af5c72a39132101aedc853450f4470db79d"} Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.565126 4795 generic.go:334] "Generic (PLEG): container finished" podID="9344b7be-b07a-4660-9352-dfdbcecac424" containerID="d03ab37f975de2c279ad295fe976f236b962ccc100e8c364d616ff19c0f5983e" exitCode=0 Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.565509 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-lcwsj" event={"ID":"9344b7be-b07a-4660-9352-dfdbcecac424","Type":"ContainerDied","Data":"d03ab37f975de2c279ad295fe976f236b962ccc100e8c364d616ff19c0f5983e"} Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.568644 4795 generic.go:334] "Generic (PLEG): container finished" podID="d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" 
containerID="faad2ded953d5657f7a517328a5146fa339ecadd1a2f2c12485311d9ecfd00f1" exitCode=0 Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.568699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-tq4s7" event={"ID":"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb","Type":"ContainerDied","Data":"faad2ded953d5657f7a517328a5146fa339ecadd1a2f2c12485311d9ecfd00f1"} Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.570446 4795 generic.go:334] "Generic (PLEG): container finished" podID="e307d045-9890-4475-8c51-395484da10ca" containerID="d952c9318d01d0f2fbed23c519883c7b2ce798dd43e62951ff40a36d13484b9f" exitCode=0 Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.570515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d78q6" event={"ID":"e307d045-9890-4475-8c51-395484da10ca","Type":"ContainerDied","Data":"d952c9318d01d0f2fbed23c519883c7b2ce798dd43e62951ff40a36d13484b9f"} Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.024740 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.118568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") pod \"8dc6e226-d501-4698-b49c-f07fc8e80339\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.118644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") pod \"8dc6e226-d501-4698-b49c-f07fc8e80339\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.118873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") pod \"8dc6e226-d501-4698-b49c-f07fc8e80339\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.127306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "8dc6e226-d501-4698-b49c-f07fc8e80339" (UID: "8dc6e226-d501-4698-b49c-f07fc8e80339"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.134578 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm" (OuterVolumeSpecName: "kube-api-access-97sdm") pod "8dc6e226-d501-4698-b49c-f07fc8e80339" (UID: "8dc6e226-d501-4698-b49c-f07fc8e80339"). 
InnerVolumeSpecName "kube-api-access-97sdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.166360 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dc6e226-d501-4698-b49c-f07fc8e80339" (UID: "8dc6e226-d501-4698-b49c-f07fc8e80339"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.221200 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.221231 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.221240 4795 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.592904 4795 generic.go:334] "Generic (PLEG): container finished" podID="fe324720-6e0b-4d15-bc6e-3875b26bf7f4" containerID="4f952e7e5b352c1a9af1933c2d718e36ecf71d34b2f2efc9a84b624480ca61e8" exitCode=0 Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.595141 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-zbskn" event={"ID":"fe324720-6e0b-4d15-bc6e-3875b26bf7f4","Type":"ContainerDied","Data":"4f952e7e5b352c1a9af1933c2d718e36ecf71d34b2f2efc9a84b624480ca61e8"} Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 
00:01:06.597899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-purge-29525761-5slgz" event={"ID":"8dc6e226-d501-4698-b49c-f07fc8e80339","Type":"ContainerDied","Data":"839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98"} Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.597964 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.598020 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.077997 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.171592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") pod \"e307d045-9890-4475-8c51-395484da10ca\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.171686 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") pod \"e307d045-9890-4475-8c51-395484da10ca\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.171769 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") pod \"e307d045-9890-4475-8c51-395484da10ca\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.171788 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") pod \"e307d045-9890-4475-8c51-395484da10ca\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.177544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2" (OuterVolumeSpecName: "kube-api-access-8b8t2") pod "e307d045-9890-4475-8c51-395484da10ca" (UID: "e307d045-9890-4475-8c51-395484da10ca"). InnerVolumeSpecName "kube-api-access-8b8t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.177836 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e307d045-9890-4475-8c51-395484da10ca" (UID: "e307d045-9890-4475-8c51-395484da10ca"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.201306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e307d045-9890-4475-8c51-395484da10ca" (UID: "e307d045-9890-4475-8c51-395484da10ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.236140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data" (OuterVolumeSpecName: "config-data") pod "e307d045-9890-4475-8c51-395484da10ca" (UID: "e307d045-9890-4475-8c51-395484da10ca"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.258266 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.264462 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.279176 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.279209 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.279222 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.279232 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") pod \"9344b7be-b07a-4660-9352-dfdbcecac424\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380256 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") pod \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380302 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") pod \"9344b7be-b07a-4660-9352-dfdbcecac424\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") pod \"9344b7be-b07a-4660-9352-dfdbcecac424\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380479 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") pod \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") pod \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380555 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") pod \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\" 
(UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") pod \"9344b7be-b07a-4660-9352-dfdbcecac424\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.384706 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n" (OuterVolumeSpecName: "kube-api-access-kpt9n") pod "9344b7be-b07a-4660-9352-dfdbcecac424" (UID: "9344b7be-b07a-4660-9352-dfdbcecac424"). InnerVolumeSpecName "kube-api-access-kpt9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.385781 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "9344b7be-b07a-4660-9352-dfdbcecac424" (UID: "9344b7be-b07a-4660-9352-dfdbcecac424"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.386307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" (UID: "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb"). InnerVolumeSpecName "db-purge-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.386460 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z" (OuterVolumeSpecName: "kube-api-access-4fd9z") pod "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" (UID: "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb"). InnerVolumeSpecName "kube-api-access-4fd9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.412426 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data" (OuterVolumeSpecName: "config-data") pod "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" (UID: "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.413038 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data" (OuterVolumeSpecName: "config-data") pod "9344b7be-b07a-4660-9352-dfdbcecac424" (UID: "9344b7be-b07a-4660-9352-dfdbcecac424"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.416520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9344b7be-b07a-4660-9352-dfdbcecac424" (UID: "9344b7be-b07a-4660-9352-dfdbcecac424"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.419477 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" (UID: "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490736 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490767 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490786 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490804 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490813 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490822 4795 reconciler_common.go:293] "Volume detached for volume 
\"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490832 4795 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490844 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.618456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-lcwsj" event={"ID":"9344b7be-b07a-4660-9352-dfdbcecac424","Type":"ContainerDied","Data":"f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e"} Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.618823 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.619226 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.621653 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.621692 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-tq4s7" event={"ID":"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb","Type":"ContainerDied","Data":"735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13"} Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.621727 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.630150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d78q6" event={"ID":"e307d045-9890-4475-8c51-395484da10ca","Type":"ContainerDied","Data":"4775fd8acc8285432dc75f9369474eda69ed12d7e07fe84286320ce0fe245768"} Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.630196 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.630216 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4775fd8acc8285432dc75f9369474eda69ed12d7e07fe84286320ce0fe245768" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.865294 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.003099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") pod \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.003209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") pod \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.003374 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") pod \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.003432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") pod \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.008709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "fe324720-6e0b-4d15-bc6e-3875b26bf7f4" (UID: "fe324720-6e0b-4d15-bc6e-3875b26bf7f4"). InnerVolumeSpecName "db-purge-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.008787 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx" (OuterVolumeSpecName: "kube-api-access-r2bpx") pod "fe324720-6e0b-4d15-bc6e-3875b26bf7f4" (UID: "fe324720-6e0b-4d15-bc6e-3875b26bf7f4"). InnerVolumeSpecName "kube-api-access-r2bpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.031767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data" (OuterVolumeSpecName: "config-data") pod "fe324720-6e0b-4d15-bc6e-3875b26bf7f4" (UID: "fe324720-6e0b-4d15-bc6e-3875b26bf7f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.036845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe324720-6e0b-4d15-bc6e-3875b26bf7f4" (UID: "fe324720-6e0b-4d15-bc6e-3875b26bf7f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.105923 4795 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.105959 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.105969 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.105978 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.645322 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-zbskn" event={"ID":"fe324720-6e0b-4d15-bc6e-3875b26bf7f4","Type":"ContainerDied","Data":"d245ab4deae1c20cb273c4781102e64ba81f2352131918cd3603be3874e3b24b"} Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.645363 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d245ab4deae1c20cb273c4781102e64ba81f2352131918cd3603be3874e3b24b" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.645423 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:15 crc kubenswrapper[4795]: I0220 00:01:15.511907 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:15 crc kubenswrapper[4795]: E0220 00:01:15.512779 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:27 crc kubenswrapper[4795]: I0220 00:01:27.511746 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:27 crc kubenswrapper[4795]: E0220 00:01:27.512503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:40 crc kubenswrapper[4795]: I0220 00:01:40.511952 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:40 crc kubenswrapper[4795]: E0220 00:01:40.513864 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:51 crc kubenswrapper[4795]: I0220 00:01:51.512476 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:51 crc kubenswrapper[4795]: E0220 00:01:51.513618 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:02:02 crc kubenswrapper[4795]: I0220 00:02:02.512211 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:02:02 crc kubenswrapper[4795]: E0220 00:02:02.512971 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:02:13 crc kubenswrapper[4795]: I0220 00:02:13.511981 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:02:13 crc kubenswrapper[4795]: E0220 00:02:13.512876 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:02:27 crc kubenswrapper[4795]: I0220 00:02:27.513366 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:02:27 crc kubenswrapper[4795]: E0220 00:02:27.514732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.357845 4795 scope.go:117] "RemoveContainer" containerID="de244572ed412b6d15883ac7be00a681de7409d595350d300ebc7aac8f96c690" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.381859 4795 scope.go:117] "RemoveContainer" containerID="62deb3f2bf568b395b46bc81de81ee26457a95f02f006897224aa24bc7dac271" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.425719 4795 scope.go:117] "RemoveContainer" containerID="3644eb2ff205a6ecbf078ee98cc404ff628a9964f7f066ce4f38a2487488db5f" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.479240 4795 scope.go:117] "RemoveContainer" containerID="6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.523358 4795 scope.go:117] "RemoveContainer" containerID="07a4317117426bae86bf07c8d5e610095ccbae15e950025068b03dd800e0da4a" Feb 20 00:02:38 crc kubenswrapper[4795]: I0220 00:02:38.512543 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:02:38 crc 
kubenswrapper[4795]: E0220 00:02:38.513662 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:02:52 crc kubenswrapper[4795]: I0220 00:02:52.512137 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:02:52 crc kubenswrapper[4795]: E0220 00:02:52.513264 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:03:05 crc kubenswrapper[4795]: I0220 00:03:05.512364 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:03:05 crc kubenswrapper[4795]: E0220 00:03:05.513370 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:03:20 crc kubenswrapper[4795]: I0220 00:03:20.512148 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:03:20 crc kubenswrapper[4795]: E0220 00:03:20.513141 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:03:32 crc kubenswrapper[4795]: I0220 00:03:32.512555 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:03:32 crc kubenswrapper[4795]: E0220 00:03:32.513409 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:03:45 crc kubenswrapper[4795]: I0220 00:03:45.512914 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:03:45 crc kubenswrapper[4795]: E0220 00:03:45.513848 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:03:59 crc kubenswrapper[4795]: I0220 00:03:59.519010 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:03:59 crc kubenswrapper[4795]: E0220 00:03:59.520122 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:04:12 crc kubenswrapper[4795]: I0220 00:04:12.512071 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:04:12 crc kubenswrapper[4795]: E0220 00:04:12.528860 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.798371 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"]
Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799241 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe324720-6e0b-4d15-bc6e-3875b26bf7f4" containerName="cinder-db-purge"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799254 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe324720-6e0b-4d15-bc6e-3875b26bf7f4" containerName="cinder-db-purge"
Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc6e226-d501-4698-b49c-f07fc8e80339" containerName="manila-db-purge"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799275 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc6e226-d501-4698-b49c-f07fc8e80339" containerName="manila-db-purge"
Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799299 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9344b7be-b07a-4660-9352-dfdbcecac424" containerName="glance-dbpurge"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9344b7be-b07a-4660-9352-dfdbcecac424" containerName="glance-dbpurge"
Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799320 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e307d045-9890-4475-8c51-395484da10ca" containerName="keystone-cron"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799326 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e307d045-9890-4475-8c51-395484da10ca" containerName="keystone-cron"
Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799333 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" containerName="heat-dbpurge"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799338 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" containerName="heat-dbpurge"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799515 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe324720-6e0b-4d15-bc6e-3875b26bf7f4" containerName="cinder-db-purge"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799530 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e307d045-9890-4475-8c51-395484da10ca" containerName="keystone-cron"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799539 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" containerName="heat-dbpurge"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799558 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc6e226-d501-4698-b49c-f07fc8e80339" containerName="manila-db-purge"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799568 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9344b7be-b07a-4660-9352-dfdbcecac424" containerName="glance-dbpurge"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.801022 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.826898 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"]
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.913944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.914024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.914335 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017111 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017915 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017923 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.043038 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.184277 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.513434 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:04:26 crc kubenswrapper[4795]: E0220 00:04:26.513990 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.890347 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"]
Feb 20 00:04:27 crc kubenswrapper[4795]: I0220 00:04:27.860070 4795 generic.go:334] "Generic (PLEG): container finished" podID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerID="ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae" exitCode=0
Feb 20 00:04:27 crc kubenswrapper[4795]: I0220 00:04:27.860141 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerDied","Data":"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae"}
Feb 20 00:04:27 crc kubenswrapper[4795]: I0220 00:04:27.860425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerStarted","Data":"7ac04f15babc163d9a654978f2668463945a212502dd56a6aa7609a7a093b1d6"}
Feb 20 00:04:27 crc kubenswrapper[4795]: I0220 00:04:27.862821 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 00:04:29 crc kubenswrapper[4795]: I0220 00:04:29.883444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerStarted","Data":"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09"}
Feb 20 00:04:30 crc kubenswrapper[4795]: I0220 00:04:30.928933 4795 generic.go:334] "Generic (PLEG): container finished" podID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerID="d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09" exitCode=0
Feb 20 00:04:30 crc kubenswrapper[4795]: I0220 00:04:30.929047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerDied","Data":"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09"}
Feb 20 00:04:31 crc kubenswrapper[4795]: I0220 00:04:31.943719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerStarted","Data":"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916"}
Feb 20 00:04:31 crc kubenswrapper[4795]: I0220 00:04:31.977838 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t6sqk" podStartSLOduration=3.365831653 podStartE2EDuration="6.977820366s" podCreationTimestamp="2026-02-20 00:04:25 +0000 UTC" firstStartedPulling="2026-02-20 00:04:27.862479948 +0000 UTC m=+9379.054997812" lastFinishedPulling="2026-02-20 00:04:31.474468641 +0000 UTC m=+9382.666986525" observedRunningTime="2026-02-20 00:04:31.967973488 +0000 UTC m=+9383.160491382" watchObservedRunningTime="2026-02-20 00:04:31.977820366 +0000 UTC m=+9383.170338230"
Feb 20 00:04:36 crc kubenswrapper[4795]: I0220 00:04:36.184521 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:36 crc kubenswrapper[4795]: I0220 00:04:36.185005 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:36 crc kubenswrapper[4795]: I0220 00:04:36.236533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:37 crc kubenswrapper[4795]: I0220 00:04:37.057909 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:37 crc kubenswrapper[4795]: I0220 00:04:37.112567 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"]
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.019065 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t6sqk" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="registry-server" containerID="cri-o://221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916" gracePeriod=2
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.522582 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:04:39 crc kubenswrapper[4795]: E0220 00:04:39.523423 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.581273 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.623023 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") pod \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") "
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.623233 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") pod \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") "
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.623371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") pod \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") "
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.625096 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities" (OuterVolumeSpecName: "utilities") pod "77dfee48-ba66-4c86-80c3-47b740e7e1c3" (UID: "77dfee48-ba66-4c86-80c3-47b740e7e1c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.632386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6" (OuterVolumeSpecName: "kube-api-access-9pkd6") pod "77dfee48-ba66-4c86-80c3-47b740e7e1c3" (UID: "77dfee48-ba66-4c86-80c3-47b740e7e1c3"). InnerVolumeSpecName "kube-api-access-9pkd6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.685581 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77dfee48-ba66-4c86-80c3-47b740e7e1c3" (UID: "77dfee48-ba66-4c86-80c3-47b740e7e1c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.726085 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") on node \"crc\" DevicePath \"\""
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.726127 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.726139 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.067710 4795 generic.go:334] "Generic (PLEG): container finished" podID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerID="221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916" exitCode=0
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.067811 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6sqk"
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.067802 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerDied","Data":"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916"}
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.068409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerDied","Data":"7ac04f15babc163d9a654978f2668463945a212502dd56a6aa7609a7a093b1d6"}
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.068431 4795 scope.go:117] "RemoveContainer" containerID="221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916"
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.092468 4795 scope.go:117] "RemoveContainer" containerID="d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09"
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.117046 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"]
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.122987 4795 scope.go:117] "RemoveContainer" containerID="ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae"
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.141133 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"]
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.176220 4795 scope.go:117] "RemoveContainer" containerID="221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916"
Feb 20 00:04:40 crc kubenswrapper[4795]: E0220 00:04:40.176742 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916\": container with ID starting with 221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916 not found: ID does not exist" containerID="221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916"
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.176785 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916"} err="failed to get container status \"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916\": rpc error: code = NotFound desc = could not find container \"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916\": container with ID starting with 221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916 not found: ID does not exist"
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.176808 4795 scope.go:117] "RemoveContainer" containerID="d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09"
Feb 20 00:04:40 crc kubenswrapper[4795]: E0220 00:04:40.177223 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09\": container with ID starting with d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09 not found: ID does not exist" containerID="d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09"
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.178770 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09"} err="failed to get container status \"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09\": rpc error: code = NotFound desc = could not find container \"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09\": container with ID starting with d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09 not found: ID does not exist"
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.178826 4795 scope.go:117] "RemoveContainer" containerID="ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae"
Feb 20 00:04:40 crc kubenswrapper[4795]: E0220 00:04:40.179239 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae\": container with ID starting with ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae not found: ID does not exist" containerID="ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae"
Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.179273 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae"} err="failed to get container status \"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae\": rpc error: code = NotFound desc = could not find container \"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae\": container with ID starting with ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae not found: ID does not exist"
Feb 20 00:04:41 crc kubenswrapper[4795]: I0220 00:04:41.524957 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" path="/var/lib/kubelet/pods/77dfee48-ba66-4c86-80c3-47b740e7e1c3/volumes"
Feb 20 00:04:53 crc kubenswrapper[4795]: I0220 00:04:53.513936 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:04:53 crc kubenswrapper[4795]: E0220 00:04:53.514866 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.042951 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b5b60b6d-7ecf-424d-a297-f98fae5ef0a3/init-config-reloader/0.log"
Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.192305 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b5b60b6d-7ecf-424d-a297-f98fae5ef0a3/init-config-reloader/0.log"
Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.270002 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b5b60b6d-7ecf-424d-a297-f98fae5ef0a3/alertmanager/0.log"
Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.328934 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b5b60b6d-7ecf-424d-a297-f98fae5ef0a3/config-reloader/0.log"
Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.462906 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_acb98719-7401-4241-8361-070eb67879c7/aodh-api/0.log"
Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.088501 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_acb98719-7401-4241-8361-070eb67879c7/aodh-evaluator/0.log"
Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.111580 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_acb98719-7401-4241-8361-070eb67879c7/aodh-notifier/0.log"
Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.124364 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_acb98719-7401-4241-8361-070eb67879c7/aodh-listener/0.log"
Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.292243 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57b58f479d-8dz8t_30f7c03f-5289-48c5-987e-b808897adc6d/barbican-api/0.log"
Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.382723 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57b58f479d-8dz8t_30f7c03f-5289-48c5-987e-b808897adc6d/barbican-api-log/0.log"
Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.467009 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b9b47c4f6-2kzbr_5bb2f008-145f-4fc9-9d51-065874ab1b1e/barbican-keystone-listener/0.log"
Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.519082 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b9b47c4f6-2kzbr_5bb2f008-145f-4fc9-9d51-065874ab1b1e/barbican-keystone-listener-log/0.log"
Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.601326 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c6d9dfdbf-zg9wc_3e6a3af4-fd31-411b-833c-5a39501f5d63/barbican-worker/0.log"
Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.897123 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c6d9dfdbf-zg9wc_3e6a3af4-fd31-411b-833c-5a39501f5d63/barbican-worker-log/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.079477 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-rmhfb_a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa/bootstrap-openstack-openstack-cell1/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.186835 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87a2d0b8-5866-4d88-ab91-cd94c2136c6c/ceilometer-central-agent/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.194752 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87a2d0b8-5866-4d88-ab91-cd94c2136c6c/ceilometer-notification-agent/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.330364 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87a2d0b8-5866-4d88-ab91-cd94c2136c6c/sg-core/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.336341 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87a2d0b8-5866-4d88-ab91-cd94c2136c6c/proxy-httpd/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.396926 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-zjjhg_532484aa-8294-4c2d-b257-082b09bafb14/ceph-client-openstack-openstack-cell1/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.511662 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.636960 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_daeb9555-6d76-45ca-b3da-b6dd91c33e00/cinder-api/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.651584 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_daeb9555-6d76-45ca-b3da-b6dd91c33e00/cinder-api-log/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.915868 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_12de80a7-e42b-4768-83d4-0ed7d7490c30/probe/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.942794 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_12de80a7-e42b-4768-83d4-0ed7d7490c30/cinder-backup/0.log"
Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.971799 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-purge-29525761-zbskn_fe324720-6e0b-4d15-bc6e-3875b26bf7f4/cinder-db-purge/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.151388 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85502c41-99ab-4a8f-9c36-f4d839b931a1/cinder-scheduler/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.223602 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85502c41-99ab-4a8f-9c36-f4d839b931a1/probe/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.328966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d"}
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.474337 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_90e22321-4464-4199-b873-8998821a02ed/probe/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.493743 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_90e22321-4464-4199-b873-8998821a02ed/cinder-volume/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.509777 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-2twf8_5a293bce-3326-47c0-a9b5-b5af13dc46c8/configure-network-openstack-openstack-cell1/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.690080 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-lbssf_0d850dd7-a1bb-42db-893b-b96eebee4c9c/configure-os-openstack-openstack-cell1/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.768690 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67d97dc55-pvbjd_f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0/init/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.914725 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67d97dc55-pvbjd_f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0/init/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.948091 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67d97dc55-pvbjd_f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0/dnsmasq-dns/0.log"
Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.980004 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-wfcx4_69464400-c61c-41bd-aeeb-984f7f948a16/download-cache-openstack-openstack-cell1/0.log"
Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.186762 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-purge-29525761-lcwsj_9344b7be-b07a-4660-9352-dfdbcecac424/glance-dbpurge/0.log"
Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.202036 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bd4ac280-c0e4-46e3-95c8-5e051c96f32e/glance-httpd/0.log"
Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.254066 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bd4ac280-c0e4-46e3-95c8-5e051c96f32e/glance-log/0.log"
Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.407076 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b7d41b06-abb7-4a30-a29c-3b9d66706d8f/glance-httpd/0.log"
Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.432714 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b7d41b06-abb7-4a30-a29c-3b9d66706d8f/glance-log/0.log"
Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.587726 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-57679899bc-rj6x7_058c5b61-3ec2-4a88-bea8-59843d00750c/heat-api/0.log"
Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.719000 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6565dd9f4d-w85dm_94ea6e46-bacd-40ca-bce9-0f28656581af/heat-cfnapi/0.log"
Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.775188 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-purge-29525761-tq4s7_d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb/heat-dbpurge/0.log"
Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.904054 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-54b48c7f4c-97pnj_a380f130-e904-41e8-90e2-93bdeb0615d6/heat-engine/0.log"
Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.061027 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f75767dd9-c8js2_53ce70ba-9e61-4dbd-b858-7059c82eed67/horizon/0.log"
Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.110325 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-tjvng_dc08b8d0-e577-4674-9ca5-b1a02818725c/install-certs-openstack-openstack-cell1/0.log"
Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.170401 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f75767dd9-c8js2_53ce70ba-9e61-4dbd-b858-7059c82eed67/horizon-log/0.log"
Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.385880 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-6ctfc_41df3556-7d70-47f5-bd79-bec74fbd269c/install-os-openstack-openstack-cell1/0.log"
Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.552960 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-689ff8fbd7-j2v4l_57c39d61-cab0-49e7-8938-06952896387e/keystone-api/0.log"
Feb
20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.679255 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525701-nmz5v_5f2d7932-b11f-4e9b-a6e0-2a9a069a3459/keystone-cron/0.log" Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.806135 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525761-d78q6_e307d045-9890-4475-8c51-395484da10ca/keystone-cron/0.log" Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.907338 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5a2a47de-c40d-40c9-8556-ea7033a4033b/kube-state-metrics/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.016136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-drll7_b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d/libvirt-openstack-openstack-cell1/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.163402 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b7cef1f6-95e4-4ccd-8a2a-49c27373a96d/manila-api-log/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.239525 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b7cef1f6-95e4-4ccd-8a2a-49c27373a96d/manila-api/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.276730 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-purge-29525761-5slgz_8dc6e226-d501-4698-b49c-f07fc8e80339/manila-db-purge/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.433576 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_25554074-26bb-4b62-a1f9-dac4cd6308b4/probe/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.460784 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_25554074-26bb-4b62-a1f9-dac4cd6308b4/manila-scheduler/0.log" Feb 20 
00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.522742 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864/manila-share/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.627071 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864/probe/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.858381 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7945766d5c-fjptf_dea0417f-0988-4d82-80cc-03298be367bd/neutron-api/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.946913 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7945766d5c-fjptf_dea0417f-0988-4d82-80cc-03298be367bd/neutron-httpd/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.035760 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-khvh4_ff3df901-a0ae-456e-8103-60aaa6439785/neutron-dhcp-openstack-openstack-cell1/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.177429 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-5855g_a23f1a80-1645-454d-b9cf-e039928b84cb/neutron-metadata-openstack-openstack-cell1/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.273629 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-56tm4_a29cf217-b932-4515-a8e6-4bb762611d24/neutron-sriov-openstack-openstack-cell1/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.626740 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_644df2e5-37fd-468b-9e52-316d44e65f69/nova-api-log/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.651437 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_644df2e5-37fd-468b-9e52-316d44e65f69/nova-api-api/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.732342 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d27d8041-4940-4cd2-bf9e-02b7aa924067/nova-cell0-conductor-conductor/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.851306 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-purge-29525760-n9sd6_57de3f43-e33f-4734-b02d-372d013b7e80/nova-manage/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.042472 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0fa49294-8a0c-4d98-a388-067bdce0ac1b/nova-cell1-conductor-conductor/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.071456 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-purge-29525760-9s2b5_2fbbf12e-019a-40d4-9a07-46b3e5b4c814/nova-manage/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.341152 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a96a8189-2b04-4ce7-908b-3544dc3b7ec4/nova-cell1-novncproxy-novncproxy/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.378730 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h_59981ca7-620e-4025-b165-4f54f920e8f2/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.568400 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-q26dr_df818d88-cec5-4daf-8b17-cc4bb298b498/nova-cell1-openstack-openstack-cell1/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.696796 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_d15a66bd-d8e7-4ad0-a8bc-7575a218f50c/nova-metadata-log/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.792823 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d15a66bd-d8e7-4ad0-a8bc-7575a218f50c/nova-metadata-metadata/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.328825 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-556fc55b45-7gxcm_c95e6fdb-6007-4490-9572-a2709f8b7daf/init/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.373058 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d16cd452-43cb-42e4-b4af-6de3271d7194/nova-scheduler-scheduler/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.645938 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-556fc55b45-7gxcm_c95e6fdb-6007-4490-9572-a2709f8b7daf/init/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.847422 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-556fc55b45-7gxcm_c95e6fdb-6007-4490-9572-a2709f8b7daf/octavia-api-provider-agent/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.880415 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-g59hr_107db266-c130-4312-be67-ffe75016fd44/init/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.995449 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-556fc55b45-7gxcm_c95e6fdb-6007-4490-9572-a2709f8b7daf/octavia-api/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.049575 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-g59hr_107db266-c130-4312-be67-ffe75016fd44/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.160866 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-healthmanager-g59hr_107db266-c130-4312-be67-ffe75016fd44/octavia-healthmanager/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.232847 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-cm4g6_7167d9ee-5127-43c9-957a-598d9dcfecb3/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.499159 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-cm4g6_7167d9ee-5127-43c9-957a-598d9dcfecb3/octavia-housekeeping/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.543808 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-8d4564f8f-vp2hp_73c5ad0c-a7f2-414d-a1f8-041a807d82b9/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.559867 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-cm4g6_7167d9ee-5127-43c9-957a-598d9dcfecb3/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.735518 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-8d4564f8f-vp2hp_73c5ad0c-a7f2-414d-a1f8-041a807d82b9/octavia-amphora-httpd/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.796834 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-8d4564f8f-vp2hp_73c5ad0c-a7f2-414d-a1f8-041a807d82b9/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.828127 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-gzbmf_b3ffcac3-ee64-440c-983d-67404e5f47fd/init/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.320430 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-gzbmf_b3ffcac3-ee64-440c-983d-67404e5f47fd/octavia-rsyslog/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.364619 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-rsyslog-gzbmf_b3ffcac3-ee64-440c-983d-67404e5f47fd/init/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.383418 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-pktjg_cfbfd9d0-564e-41d0-8171-5f32f380a3df/init/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.585762 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-pktjg_cfbfd9d0-564e-41d0-8171-5f32f380a3df/init/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.692865 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c8f55130-d799-45ef-b174-450b6c3b52ff/mysql-bootstrap/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.769332 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-pktjg_cfbfd9d0-564e-41d0-8171-5f32f380a3df/octavia-worker/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.948094 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c8f55130-d799-45ef-b174-450b6c3b52ff/galera/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.973191 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c8f55130-d799-45ef-b174-450b6c3b52ff/mysql-bootstrap/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.050485 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_24345708-df30-4486-bc7e-44eaa7722ffd/mysql-bootstrap/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.217041 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_24345708-df30-4486-bc7e-44eaa7722ffd/mysql-bootstrap/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.309379 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_f1d06b1e-9114-47b8-913d-86144f6314c3/openstackclient/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.309469 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_24345708-df30-4486-bc7e-44eaa7722ffd/galera/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.516986 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-knqfl_3bbc323f-3f18-42bc-b0d8-12f021d91d6b/ovn-controller/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.575521 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kx9qd_b48804d5-a275-45dd-896c-f35b7a322690/openstack-network-exporter/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.745138 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lrv52_9e98c62c-20fc-462c-9973-2616cb184032/ovsdb-server-init/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.982271 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lrv52_9e98c62c-20fc-462c-9973-2616cb184032/ovsdb-server/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.995272 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lrv52_9e98c62c-20fc-462c-9973-2616cb184032/ovsdb-server-init/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.005434 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lrv52_9e98c62c-20fc-462c-9973-2616cb184032/ovs-vswitchd/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.196655 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b989c1be-7a74-42ee-a27b-dc34ce8d727a/openstack-network-exporter/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.198940 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_b989c1be-7a74-42ee-a27b-dc34ce8d727a/ovn-northd/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.300270 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-b9rq5_d02efd94-2196-48fe-85d5-e2c65d186d6e/ovn-openstack-openstack-cell1/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.401050 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9b32c19b-2b8b-4587-9327-1ddf5b074ad6/openstack-network-exporter/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.536028 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9b32c19b-2b8b-4587-9327-1ddf5b074ad6/ovsdbserver-nb/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.639944 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f814768e-2961-4d2a-ba3b-615dea717cf8/openstack-network-exporter/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.643126 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f814768e-2961-4d2a-ba3b-615dea717cf8/ovsdbserver-nb/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.771680 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_188a11e4-50de-4672-baaf-89a3a512cd0c/openstack-network-exporter/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.891725 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_188a11e4-50de-4672-baaf-89a3a512cd0c/ovsdbserver-nb/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.995317 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_924d2a8a-2ae7-417a-9770-054662474286/ovsdbserver-sb/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.024894 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_924d2a8a-2ae7-417a-9770-054662474286/openstack-network-exporter/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.160234 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_9c5bddd7-705d-41b3-ad43-1889c6c34ab0/openstack-network-exporter/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.237931 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_9c5bddd7-705d-41b3-ad43-1889c6c34ab0/ovsdbserver-sb/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.388628 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9c8cf9f5-7499-4c52-9710-91b96d49b0fc/openstack-network-exporter/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.391782 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9c8cf9f5-7499-4c52-9710-91b96d49b0fc/ovsdbserver-sb/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.623432 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-764895875b-czlhk_f4bb335d-ad73-403a-a25f-8e6f33f60ecb/placement-api/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.714372 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-764895875b-czlhk_f4bb335d-ad73-403a-a25f-8e6f33f60ecb/placement-log/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.950136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n_7ba2e854-6881-4f7f-8068-7abf4df26229/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.092292 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/init-config-reloader/0.log" Feb 20 00:05:17 crc 
kubenswrapper[4795]: I0220 00:05:17.274250 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/init-config-reloader/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.302971 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/config-reloader/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.341454 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/prometheus/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.362273 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/thanos-sidecar/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.534703 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ce76f8f5-4383-4be1-ab7b-cf862ae77025/setup-container/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.729243 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ce76f8f5-4383-4be1-ab7b-cf862ae77025/setup-container/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.805727 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b/setup-container/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.891365 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ce76f8f5-4383-4be1-ab7b-cf862ae77025/rabbitmq/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.019523 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b/setup-container/0.log" Feb 20 00:05:18 crc 
kubenswrapper[4795]: I0220 00:05:18.046837 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b/rabbitmq/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.160848 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-bdhcf_99fb1ef3-d414-4a7e-9db8-54edf1aad197/reboot-os-openstack-openstack-cell1/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.270923 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-7vb2g_d82522ab-bf1a-47f9-902b-c82105b5d09b/run-os-openstack-openstack-cell1/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.444952 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-qzhll_adb280f6-14e8-45d3-91a1-1bf325d84aef/ssh-known-hosts-openstack/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.615081 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-45fwk_8272a408-0416-4077-9e85-b2962992b3f4/telemetry-openstack-openstack-cell1/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.816810 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg_c3cbdd11-d93f-4025-9c08-7530a68f6113/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.913091 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-44t2w_94dbf6be-911e-46d9-a950-fa19fa137490/validate-network-openstack-openstack-cell1/0.log" Feb 20 00:05:19 crc kubenswrapper[4795]: I0220 00:05:19.487931 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3d516d65-1efc-42ee-ab17-971e2d94e4a7/memcached/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: 
I0220 00:05:42.422059 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/util/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.637226 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/util/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.698528 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/pull/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.698915 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/pull/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.896599 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/pull/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.917801 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/util/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.918623 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/extract/0.log" Feb 20 00:05:43 crc kubenswrapper[4795]: I0220 00:05:43.385540 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-d8wqs_5c867f91-2ab2-43ce-8291-6d01825610d1/manager/0.log" Feb 20 00:05:43 crc kubenswrapper[4795]: I0220 00:05:43.954836 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-5cnjr_d19ed31e-e599-40ec-935d-d1d404e4c7a5/manager/0.log" Feb 20 00:05:44 crc kubenswrapper[4795]: I0220 00:05:44.024705 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-shb4d_268c2664-09cc-4616-9280-0dd6ae4159dc/manager/0.log" Feb 20 00:05:44 crc kubenswrapper[4795]: I0220 00:05:44.344669 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-fdd85_e37494c1-8780-4612-8569-fada28f0e772/manager/0.log" Feb 20 00:05:44 crc kubenswrapper[4795]: I0220 00:05:44.848367 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-z7hnk_4cc5be3d-87d8-46a4-ba7d-d95143c11857/manager/0.log" Feb 20 00:05:44 crc kubenswrapper[4795]: I0220 00:05:44.995078 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-t87bb_02592cbe-e1d4-4b62-8795-a204d5335594/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.426903 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-t6hpt_c7e19956-a3fb-4ed2-bc2a-72084ed62ac2/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.496691 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-qjgvw_2e80963b-888b-4bb9-9259-864e38dd10ed/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.626485 4795 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-7n98g_1d6085d5-f9db-4129-8662-b3ae045decfc/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.827415 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-mr8mh_0bdb1789-27ad-4535-86d3-fd2fb7cebba2/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.965560 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-4cf2p_c2c4435e-a135-4c1f-bad4-121458c09bc3/manager/0.log" Feb 20 00:05:46 crc kubenswrapper[4795]: I0220 00:05:46.425450 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-5b89b_5a6d3cc3-7e00-4013-b568-c2b835d8e2b9/manager/0.log" Feb 20 00:05:46 crc kubenswrapper[4795]: I0220 00:05:46.773284 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p_26db9cb2-1ed4-44e4-afac-404ce0f7d445/manager/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.316332 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-vwgdm_54a55994-69ff-48f1-8d75-24b2a828cdc9/manager/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.420026 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-vbjcq_c6c44d2f-3e8f-42de-babe-85a8fc1a97ec/operator/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.715385 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tf75g_91c93ffc-fbe2-486e-92a9-ca5737dc7875/registry-server/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.827858 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-slqxz_e0cad59b-249e-446f-b3fa-6be8aac2a858/manager/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.975651 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-bwsj2_7b637620-f307-4e2b-b92d-f1e0d50b0071/manager/0.log" Feb 20 00:05:48 crc kubenswrapper[4795]: I0220 00:05:48.092064 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-vv89z_80ce3bc1-0926-47a3-acc2-6f2d8be4089c/operator/0.log" Feb 20 00:05:48 crc kubenswrapper[4795]: I0220 00:05:48.219355 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-dqpjx_98979ac7-9fb1-49f8-8022-562082fc76f7/manager/0.log" Feb 20 00:05:48 crc kubenswrapper[4795]: I0220 00:05:48.541347 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-rcjgz_6bdc9c62-d8c1-42d5-8696-324fdc7abc2f/manager/0.log" Feb 20 00:05:48 crc kubenswrapper[4795]: I0220 00:05:48.682103 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-slj65_5f4d8698-27a0-44a4-87f6-c75d4c3407bc/manager/0.log" Feb 20 00:05:49 crc kubenswrapper[4795]: I0220 00:05:49.263729 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-bbtgm_09ce2dcf-0fb0-4180-a019-09d1abfec00e/manager/0.log" Feb 20 00:05:50 crc kubenswrapper[4795]: I0220 00:05:50.610331 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-62xdp_9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4/manager/0.log" Feb 20 00:05:50 crc kubenswrapper[4795]: I0220 00:05:50.837564 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-ckxlw_b22b5096-41cf-40c9-94f6-8e546ca96a96/manager/0.log" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.466353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:05:57 crc kubenswrapper[4795]: E0220 00:05:57.467370 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="extract-utilities" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.467384 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="extract-utilities" Feb 20 00:05:57 crc kubenswrapper[4795]: E0220 00:05:57.467408 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="registry-server" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.467413 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="registry-server" Feb 20 00:05:57 crc kubenswrapper[4795]: E0220 00:05:57.467447 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="extract-content" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.467454 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="extract-content" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.467631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="registry-server" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.471783 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.489696 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.546246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.546316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.546394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.648749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.648828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.648936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.649628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.649667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.680225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.795568 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:58 crc kubenswrapper[4795]: I0220 00:05:58.297343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:05:58 crc kubenswrapper[4795]: W0220 00:05:58.309074 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20cdd942_7d19_4f30_9952_d7f228b9ce25.slice/crio-7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81 WatchSource:0}: Error finding container 7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81: Status 404 returned error can't find the container with id 7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81 Feb 20 00:05:58 crc kubenswrapper[4795]: I0220 00:05:58.835105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerStarted","Data":"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f"} Feb 20 00:05:58 crc kubenswrapper[4795]: I0220 00:05:58.835432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerStarted","Data":"7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81"} Feb 20 00:05:59 crc kubenswrapper[4795]: I0220 00:05:59.850489 4795 generic.go:334] "Generic (PLEG): container finished" podID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerID="5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f" exitCode=0 Feb 20 00:05:59 crc kubenswrapper[4795]: I0220 00:05:59.850590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" 
event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerDied","Data":"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f"} Feb 20 00:06:01 crc kubenswrapper[4795]: I0220 00:06:01.873273 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerStarted","Data":"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822"} Feb 20 00:06:04 crc kubenswrapper[4795]: I0220 00:06:04.911986 4795 generic.go:334] "Generic (PLEG): container finished" podID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerID="3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822" exitCode=0 Feb 20 00:06:04 crc kubenswrapper[4795]: I0220 00:06:04.912079 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerDied","Data":"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822"} Feb 20 00:06:05 crc kubenswrapper[4795]: I0220 00:06:05.923108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerStarted","Data":"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5"} Feb 20 00:06:05 crc kubenswrapper[4795]: I0220 00:06:05.940906 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7hdhn" podStartSLOduration=3.462705052 podStartE2EDuration="8.940884878s" podCreationTimestamp="2026-02-20 00:05:57 +0000 UTC" firstStartedPulling="2026-02-20 00:05:59.853142657 +0000 UTC m=+9471.045660521" lastFinishedPulling="2026-02-20 00:06:05.331322483 +0000 UTC m=+9476.523840347" observedRunningTime="2026-02-20 00:06:05.939870299 +0000 UTC m=+9477.132388163" watchObservedRunningTime="2026-02-20 00:06:05.940884878 +0000 UTC m=+9477.133402742" 
Feb 20 00:06:07 crc kubenswrapper[4795]: I0220 00:06:07.796005 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:07 crc kubenswrapper[4795]: I0220 00:06:07.796613 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:08 crc kubenswrapper[4795]: I0220 00:06:08.845252 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7hdhn" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" probeResult="failure" output=< Feb 20 00:06:08 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 20 00:06:08 crc kubenswrapper[4795]: > Feb 20 00:06:10 crc kubenswrapper[4795]: I0220 00:06:10.494613 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-42wfj_0da0af7f-f8f8-492d-bd44-1e81ab242a24/control-plane-machine-set-operator/0.log" Feb 20 00:06:10 crc kubenswrapper[4795]: I0220 00:06:10.685486 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7mzng_ab78fbf6-65df-4306-a7b8-c7bd98cfdf49/kube-rbac-proxy/0.log" Feb 20 00:06:10 crc kubenswrapper[4795]: I0220 00:06:10.719184 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7mzng_ab78fbf6-65df-4306-a7b8-c7bd98cfdf49/machine-api-operator/0.log" Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.835927 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.839598 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.849728 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.916849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.916928 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.916963 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.020732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.020826 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.020848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.021418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.021620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.043993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.177401 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.785966 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.869828 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7hdhn" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" probeResult="failure" output=< Feb 20 00:06:18 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 20 00:06:18 crc kubenswrapper[4795]: > Feb 20 00:06:20 crc kubenswrapper[4795]: I0220 00:06:20.058859 4795 generic.go:334] "Generic (PLEG): container finished" podID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerID="29b21899c393a3503ba5ae6dcf3912d8e17c4a592516cd085863f44ad997cd91" exitCode=0 Feb 20 00:06:20 crc kubenswrapper[4795]: I0220 00:06:20.059018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerDied","Data":"29b21899c393a3503ba5ae6dcf3912d8e17c4a592516cd085863f44ad997cd91"} Feb 20 00:06:20 crc kubenswrapper[4795]: I0220 00:06:20.059400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerStarted","Data":"c26d2d03a0a7f24ca832726bbb2107bca07d829795090d5a9ba6e14ef44f0230"} Feb 20 00:06:22 crc kubenswrapper[4795]: I0220 00:06:22.094998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerStarted","Data":"1bcbbf1fb3a51b7809b2c9d3857d7d30bd666a5c7ea1fd4ea2c2765511ea4288"} Feb 20 00:06:23 crc kubenswrapper[4795]: I0220 00:06:23.109669 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerID="1bcbbf1fb3a51b7809b2c9d3857d7d30bd666a5c7ea1fd4ea2c2765511ea4288" exitCode=0 Feb 20 00:06:23 crc kubenswrapper[4795]: I0220 00:06:23.109762 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerDied","Data":"1bcbbf1fb3a51b7809b2c9d3857d7d30bd666a5c7ea1fd4ea2c2765511ea4288"} Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.121246 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerStarted","Data":"8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13"} Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.148306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wqgfs" podStartSLOduration=3.684358736 podStartE2EDuration="7.148288457s" podCreationTimestamp="2026-02-20 00:06:17 +0000 UTC" firstStartedPulling="2026-02-20 00:06:20.061304952 +0000 UTC m=+9491.253822806" lastFinishedPulling="2026-02-20 00:06:23.525234663 +0000 UTC m=+9494.717752527" observedRunningTime="2026-02-20 00:06:24.138367398 +0000 UTC m=+9495.330885262" watchObservedRunningTime="2026-02-20 00:06:24.148288457 +0000 UTC m=+9495.340806321" Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.269291 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-jdbs7_c1df7da5-3926-430a-8085-202bccbc4d73/cert-manager-controller/0.log" Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.565114 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-bjb5c_7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee/cert-manager-webhook/0.log" Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.727645 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-n8skq_35b44919-239d-4fe8-8c53-a3698e24f753/cert-manager-cainjector/0.log" Feb 20 00:06:27 crc kubenswrapper[4795]: I0220 00:06:27.851106 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:27 crc kubenswrapper[4795]: I0220 00:06:27.911839 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:28 crc kubenswrapper[4795]: I0220 00:06:28.178091 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:28 crc kubenswrapper[4795]: I0220 00:06:28.178148 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:28 crc kubenswrapper[4795]: I0220 00:06:28.222362 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.069262 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.166769 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7hdhn" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" containerID="cri-o://2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" gracePeriod=2 Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.226565 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.673365 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.763127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") pod \"20cdd942-7d19-4f30-9952-d7f228b9ce25\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.763523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") pod \"20cdd942-7d19-4f30-9952-d7f228b9ce25\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.763603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") pod \"20cdd942-7d19-4f30-9952-d7f228b9ce25\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.764182 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities" (OuterVolumeSpecName: "utilities") pod "20cdd942-7d19-4f30-9952-d7f228b9ce25" (UID: "20cdd942-7d19-4f30-9952-d7f228b9ce25"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.866026 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.880911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20cdd942-7d19-4f30-9952-d7f228b9ce25" (UID: "20cdd942-7d19-4f30-9952-d7f228b9ce25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.968598 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178071 4795 generic.go:334] "Generic (PLEG): container finished" podID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerID="2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" exitCode=0 Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178196 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerDied","Data":"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5"} Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178310 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerDied","Data":"7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81"} Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178349 4795 scope.go:117] "RemoveContainer" containerID="2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.197327 4795 scope.go:117] "RemoveContainer" containerID="3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.332625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb" (OuterVolumeSpecName: "kube-api-access-448mb") pod "20cdd942-7d19-4f30-9952-d7f228b9ce25" (UID: "20cdd942-7d19-4f30-9952-d7f228b9ce25"). InnerVolumeSpecName "kube-api-access-448mb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.347646 4795 scope.go:117] "RemoveContainer" containerID="5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.378468 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.430219 4795 scope.go:117] "RemoveContainer" containerID="2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" Feb 20 00:06:30 crc kubenswrapper[4795]: E0220 00:06:30.430615 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5\": container with ID starting with 2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5 not found: ID does not exist" containerID="2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.430638 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5"} err="failed to get container status \"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5\": rpc error: code = NotFound desc = could not find container \"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5\": container with ID starting with 2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5 not found: ID does not exist" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.430658 4795 scope.go:117] "RemoveContainer" containerID="3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822" Feb 20 00:06:30 crc kubenswrapper[4795]: E0220 00:06:30.431108 
4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822\": container with ID starting with 3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822 not found: ID does not exist" containerID="3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.431134 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822"} err="failed to get container status \"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822\": rpc error: code = NotFound desc = could not find container \"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822\": container with ID starting with 3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822 not found: ID does not exist" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.431149 4795 scope.go:117] "RemoveContainer" containerID="5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f" Feb 20 00:06:30 crc kubenswrapper[4795]: E0220 00:06:30.431543 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f\": container with ID starting with 5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f not found: ID does not exist" containerID="5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.431645 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f"} err="failed to get container status \"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f\": rpc error: code = 
NotFound desc = could not find container \"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f\": container with ID starting with 5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f not found: ID does not exist" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.469094 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.518663 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.528346 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:06:31 crc kubenswrapper[4795]: I0220 00:06:31.188901 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wqgfs" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="registry-server" containerID="cri-o://8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13" gracePeriod=2 Feb 20 00:06:31 crc kubenswrapper[4795]: I0220 00:06:31.524568 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" path="/var/lib/kubelet/pods/20cdd942-7d19-4f30-9952-d7f228b9ce25/volumes" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.206308 4795 generic.go:334] "Generic (PLEG): container finished" podID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerID="8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13" exitCode=0 Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.206689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerDied","Data":"8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13"} Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 
00:06:32.206716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerDied","Data":"c26d2d03a0a7f24ca832726bbb2107bca07d829795090d5a9ba6e14ef44f0230"} Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.206728 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26d2d03a0a7f24ca832726bbb2107bca07d829795090d5a9ba6e14ef44f0230" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.376370 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.523630 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") pod \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.523732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") pod \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.523892 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") pod \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.524386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities" (OuterVolumeSpecName: 
"utilities") pod "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" (UID: "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.556693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7" (OuterVolumeSpecName: "kube-api-access-v9gk7") pod "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" (UID: "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca"). InnerVolumeSpecName "kube-api-access-v9gk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.592255 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" (UID: "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.626741 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.626779 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.626791 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:33 crc kubenswrapper[4795]: I0220 00:06:33.218055 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:33 crc kubenswrapper[4795]: I0220 00:06:33.261508 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:33 crc kubenswrapper[4795]: I0220 00:06:33.273354 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:33 crc kubenswrapper[4795]: I0220 00:06:33.526541 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" path="/var/lib/kubelet/pods/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca/volumes" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.293905 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-gp2td_f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8/nmstate-console-plugin/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.477924 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zqk47_06d09723-c7bd-422c-b447-70dee244cc05/nmstate-handler/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.519739 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-kgnfd_1dfc7b5c-9302-4774-a6c8-e76ff4d60385/kube-rbac-proxy/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.561681 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-kgnfd_1dfc7b5c-9302-4774-a6c8-e76ff4d60385/nmstate-metrics/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.710898 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-rzkrx_6b614198-6804-46a3-bb1e-d8495c0d53d6/nmstate-operator/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.772414 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-nk7zb_6c89273b-007f-44e6-88da-f48de3a5f03b/nmstate-webhook/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.314427 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-s52kw_0a29e309-2974-42a7-afd9-c77d17f414d0/prometheus-operator/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.746256 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd_658abf91-1e8b-4182-998f-76d3ed17b836/prometheus-operator-admission-webhook/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.776736 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb_fb0e3807-a209-43ca-a245-64283a1d021f/prometheus-operator-admission-webhook/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.942244 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ls2vk_5d03bcc6-aa94-401a-9a3b-4970f64537cd/operator/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.986109 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-w6sln_e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b/perses-operator/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.219738 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-xrsfh_1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4/kube-rbac-proxy/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.523798 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-frr-files/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.721440 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-xrsfh_1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4/controller/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.732761 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-frr-files/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.734543 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-reloader/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.778344 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-metrics/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.879474 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-reloader/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.053375 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-reloader/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.087959 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-frr-files/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.096129 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-metrics/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.103762 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-metrics/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.690380 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-frr-files/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.707888 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-metrics/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.707942 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-reloader/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.757192 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/controller/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.911940 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/frr-metrics/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.917552 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/kube-rbac-proxy/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.967581 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/kube-rbac-proxy-frr/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.131624 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/reloader/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.168865 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-5qtpm_e32c1521-9c29-4d70-b4bb-54af4127daaf/frr-k8s-webhook-server/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.346840 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-549cf7d797-tscrj_2eb889b2-1f23-4497-a779-5312fcd470b1/manager/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.606295 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d8d766c-z8q6x_94a7e477-a2bd-4c46-8eb0-084260fade4a/webhook-server/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.702145 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kmbww_32ed0d55-a2df-4643-9283-e5bc8d1c993e/kube-rbac-proxy/0.log" Feb 20 00:07:12 crc kubenswrapper[4795]: I0220 00:07:12.702478 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kmbww_32ed0d55-a2df-4643-9283-e5bc8d1c993e/speaker/0.log" Feb 20 00:07:14 crc kubenswrapper[4795]: I0220 00:07:14.348733 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/frr/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.563443 4795 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/util/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.720474 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/util/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.777362 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/pull/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.777459 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/pull/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.940030 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.012501 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/extract/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.025961 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.134513 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.303539 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.335599 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.403332 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.511156 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/extract/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.516785 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.574190 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.713436 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/util/0.log" Feb 20 
00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.901957 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.904302 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.907185 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/pull/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.146797 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/util/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.162694 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/extract/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.189447 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/pull/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.339760 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-utilities/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.544278 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-content/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.567858 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-utilities/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.588959 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-content/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.720057 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-content/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.727733 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-utilities/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.987413 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-utilities/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.112447 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-utilities/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.134183 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-content/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.189409 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-content/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.427091 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.427186 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.487070 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-utilities/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.507539 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-content/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.774575 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/util/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.961359 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/util/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.021382 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/pull/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.103280 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/registry-server/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.180822 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/pull/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.371587 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/util/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.442618 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/pull/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.484725 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/extract/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.648930 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n9qlf_c91304a6-fa59-4df4-aa17-d7d2f73d9103/marketplace-operator/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.752308 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-utilities/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.826109 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/registry-server/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.921285 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-content/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.944594 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-content/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.958834 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-utilities/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.139933 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-utilities/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.140478 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-content/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.245297 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-utilities/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.411829 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-utilities/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.438772 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-content/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.527572 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-content/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.605936 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/registry-server/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.643527 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-content/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.695547 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-utilities/0.log" Feb 20 00:07:31 crc kubenswrapper[4795]: I0220 00:07:31.827488 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/registry-server/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.041210 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd_658abf91-1e8b-4182-998f-76d3ed17b836/prometheus-operator-admission-webhook/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.044302 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-s52kw_0a29e309-2974-42a7-afd9-c77d17f414d0/prometheus-operator/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.119340 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb_fb0e3807-a209-43ca-a245-64283a1d021f/prometheus-operator-admission-webhook/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.261152 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ls2vk_5d03bcc6-aa94-401a-9a3b-4970f64537cd/operator/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.287278 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-w6sln_e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b/perses-operator/0.log" Feb 20 00:07:52 crc kubenswrapper[4795]: E0220 00:07:52.282635 4795 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.69:51250->38.102.83.69:37561: write tcp 38.102.83.69:51250->38.102.83.69:37561: write: broken pipe Feb 20 00:07:58 crc kubenswrapper[4795]: I0220 00:07:58.427644 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:07:58 crc kubenswrapper[4795]: I0220 00:07:58.429154 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.427661 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.428207 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.428253 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.429077 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.429128 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d" gracePeriod=600 Feb 20 00:08:29 crc kubenswrapper[4795]: I0220 00:08:29.403610 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d" exitCode=0 Feb 20 00:08:29 crc kubenswrapper[4795]: I0220 00:08:29.403704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d"} Feb 20 00:08:29 crc kubenswrapper[4795]: I0220 00:08:29.404281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"} Feb 20 00:08:29 crc kubenswrapper[4795]: I0220 00:08:29.404306 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.850619 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"] Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851671 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="extract-utilities" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851686 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="extract-utilities" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851703 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="extract-content" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851711 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="extract-content" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="extract-content" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851737 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" 
containerName="extract-content" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851750 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851757 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851776 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851782 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851789 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="extract-utilities" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851795 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="extract-utilities" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.852032 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.852056 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.853798 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.867788 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"] Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.887803 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.887869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.888001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.989878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.990160 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.990306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.990441 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.990652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:45 crc kubenswrapper[4795]: I0220 00:08:45.009575 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:45 crc kubenswrapper[4795]: I0220 00:08:45.196641 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:45 crc kubenswrapper[4795]: I0220 00:08:45.689543 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"] Feb 20 00:08:46 crc kubenswrapper[4795]: I0220 00:08:46.598366 4795 generic.go:334] "Generic (PLEG): container finished" podID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" containerID="cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69" exitCode=0 Feb 20 00:08:46 crc kubenswrapper[4795]: I0220 00:08:46.598412 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerDied","Data":"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69"} Feb 20 00:08:46 crc kubenswrapper[4795]: I0220 00:08:46.598774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerStarted","Data":"4adb63b598af0614119b00d6faa7a9924ef2d0a6f681c9bc978a176a914368ef"} Feb 20 00:08:48 crc kubenswrapper[4795]: I0220 00:08:48.619372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerStarted","Data":"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011"} Feb 20 00:08:49 crc kubenswrapper[4795]: I0220 00:08:49.631094 4795 generic.go:334] "Generic (PLEG): container finished" podID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" containerID="56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011" exitCode=0 Feb 20 00:08:49 crc kubenswrapper[4795]: I0220 00:08:49.631364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" 
event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerDied","Data":"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011"} Feb 20 00:08:51 crc kubenswrapper[4795]: I0220 00:08:51.655197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerStarted","Data":"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09"} Feb 20 00:08:51 crc kubenswrapper[4795]: I0220 00:08:51.679593 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8hsl9" podStartSLOduration=4.211116367 podStartE2EDuration="7.679574046s" podCreationTimestamp="2026-02-20 00:08:44 +0000 UTC" firstStartedPulling="2026-02-20 00:08:46.600484389 +0000 UTC m=+9637.793002253" lastFinishedPulling="2026-02-20 00:08:50.068942058 +0000 UTC m=+9641.261459932" observedRunningTime="2026-02-20 00:08:51.67937744 +0000 UTC m=+9642.871895304" watchObservedRunningTime="2026-02-20 00:08:51.679574046 +0000 UTC m=+9642.872091900" Feb 20 00:08:55 crc kubenswrapper[4795]: I0220 00:08:55.197474 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:55 crc kubenswrapper[4795]: I0220 00:08:55.197974 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:55 crc kubenswrapper[4795]: I0220 00:08:55.250899 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:55 crc kubenswrapper[4795]: I0220 00:08:55.761942 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:56 crc kubenswrapper[4795]: I0220 00:08:56.036487 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8hsl9"] Feb 20 00:08:57 crc kubenswrapper[4795]: I0220 00:08:57.726673 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8hsl9" podUID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" containerName="registry-server" containerID="cri-o://8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09" gracePeriod=2 Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.248477 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.377656 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") pod \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.377710 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") pod \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.377955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") pod \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.378656 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities" (OuterVolumeSpecName: "utilities") pod "99349ff2-bb13-4040-a617-0e7f78e9e3ed" (UID: 
"99349ff2-bb13-4040-a617-0e7f78e9e3ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.382933 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq" (OuterVolumeSpecName: "kube-api-access-8czgq") pod "99349ff2-bb13-4040-a617-0e7f78e9e3ed" (UID: "99349ff2-bb13-4040-a617-0e7f78e9e3ed"). InnerVolumeSpecName "kube-api-access-8czgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.409647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99349ff2-bb13-4040-a617-0e7f78e9e3ed" (UID: "99349ff2-bb13-4040-a617-0e7f78e9e3ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.480279 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") on node \"crc\" DevicePath \"\"" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.480316 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.480326 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.741138 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" containerID="8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09" exitCode=0 Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.741778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerDied","Data":"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09"} Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.741834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerDied","Data":"4adb63b598af0614119b00d6faa7a9924ef2d0a6f681c9bc978a176a914368ef"} Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.741873 4795 scope.go:117] "RemoveContainer" containerID="8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.742130 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.775256 4795 scope.go:117] "RemoveContainer" containerID="56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.801051 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"] Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.811144 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"] Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.822824 4795 scope.go:117] "RemoveContainer" containerID="cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.867341 4795 scope.go:117] "RemoveContainer" containerID="8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09" Feb 20 00:08:58 crc kubenswrapper[4795]: E0220 00:08:58.867703 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09\": container with ID starting with 8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09 not found: ID does not exist" containerID="8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.867743 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09"} err="failed to get container status \"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09\": rpc error: code = NotFound desc = could not find container \"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09\": container with ID starting with 8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09 not found: 
ID does not exist" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.867763 4795 scope.go:117] "RemoveContainer" containerID="56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011" Feb 20 00:08:58 crc kubenswrapper[4795]: E0220 00:08:58.868075 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011\": container with ID starting with 56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011 not found: ID does not exist" containerID="56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.868105 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011"} err="failed to get container status \"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011\": rpc error: code = NotFound desc = could not find container \"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011\": container with ID starting with 56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011 not found: ID does not exist" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.868125 4795 scope.go:117] "RemoveContainer" containerID="cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69" Feb 20 00:08:58 crc kubenswrapper[4795]: E0220 00:08:58.868397 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69\": container with ID starting with cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69 not found: ID does not exist" containerID="cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69" Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.868430 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69"} err="failed to get container status \"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69\": rpc error: code = NotFound desc = could not find container \"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69\": container with ID starting with cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69 not found: ID does not exist" Feb 20 00:08:59 crc kubenswrapper[4795]: I0220 00:08:59.524772 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" path="/var/lib/kubelet/pods/99349ff2-bb13-4040-a617-0e7f78e9e3ed/volumes" Feb 20 00:09:58 crc kubenswrapper[4795]: I0220 00:09:58.427833 4795 generic.go:334] "Generic (PLEG): container finished" podID="d06c32e0-d01f-47e9-871b-9fdfb391d796" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5" exitCode=0 Feb 20 00:09:58 crc kubenswrapper[4795]: I0220 00:09:58.427937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/must-gather-ltjmp" event={"ID":"d06c32e0-d01f-47e9-871b-9fdfb391d796","Type":"ContainerDied","Data":"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"} Feb 20 00:09:58 crc kubenswrapper[4795]: I0220 00:09:58.429226 4795 scope.go:117] "RemoveContainer" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5" Feb 20 00:09:58 crc kubenswrapper[4795]: I0220 00:09:58.785138 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gv265_must-gather-ltjmp_d06c32e0-d01f-47e9-871b-9fdfb391d796/gather/0.log" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.041775 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"] Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.042793 4795 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-must-gather-gv265/must-gather-ltjmp" podUID="d06c32e0-d01f-47e9-871b-9fdfb391d796" containerName="copy" containerID="cri-o://340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70" gracePeriod=2 Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.067000 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"] Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.511436 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gv265_must-gather-ltjmp_d06c32e0-d01f-47e9-871b-9fdfb391d796/copy/0.log" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.513145 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.516160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") pod \"d06c32e0-d01f-47e9-871b-9fdfb391d796\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.516390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") pod \"d06c32e0-d01f-47e9-871b-9fdfb391d796\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.552374 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl" (OuterVolumeSpecName: "kube-api-access-g5kfl") pod "d06c32e0-d01f-47e9-871b-9fdfb391d796" (UID: "d06c32e0-d01f-47e9-871b-9fdfb391d796"). InnerVolumeSpecName "kube-api-access-g5kfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.567836 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gv265_must-gather-ltjmp_d06c32e0-d01f-47e9-871b-9fdfb391d796/copy/0.log" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.570836 4795 generic.go:334] "Generic (PLEG): container finished" podID="d06c32e0-d01f-47e9-871b-9fdfb391d796" containerID="340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70" exitCode=143 Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.570888 4795 scope.go:117] "RemoveContainer" containerID="340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.571014 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.604685 4795 scope.go:117] "RemoveContainer" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.620482 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") on node \"crc\" DevicePath \"\"" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.670807 4795 scope.go:117] "RemoveContainer" containerID="340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70" Feb 20 00:10:08 crc kubenswrapper[4795]: E0220 00:10:08.671312 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70\": container with ID starting with 340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70 not found: ID does not exist" 
containerID="340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.671384 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70"} err="failed to get container status \"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70\": rpc error: code = NotFound desc = could not find container \"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70\": container with ID starting with 340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70 not found: ID does not exist" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.671413 4795 scope.go:117] "RemoveContainer" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5" Feb 20 00:10:08 crc kubenswrapper[4795]: E0220 00:10:08.671713 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5\": container with ID starting with 35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5 not found: ID does not exist" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.671747 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"} err="failed to get container status \"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5\": rpc error: code = NotFound desc = could not find container \"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5\": container with ID starting with 35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5 not found: ID does not exist" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.749055 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d06c32e0-d01f-47e9-871b-9fdfb391d796" (UID: "d06c32e0-d01f-47e9-871b-9fdfb391d796"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.825215 4795 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 20 00:10:09 crc kubenswrapper[4795]: I0220 00:10:09.537606 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06c32e0-d01f-47e9-871b-9fdfb391d796" path="/var/lib/kubelet/pods/d06c32e0-d01f-47e9-871b-9fdfb391d796/volumes" Feb 20 00:10:28 crc kubenswrapper[4795]: I0220 00:10:28.428104 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:10:28 crc kubenswrapper[4795]: I0220 00:10:28.428587 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:10:58 crc kubenswrapper[4795]: I0220 00:10:58.427794 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body=
Feb 20 00:10:58 crc kubenswrapper[4795]: I0220 00:10:58.429440 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.427112 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.427715 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.427759 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.428588 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.428637 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be" gracePeriod=600
Feb 20 00:11:29 crc kubenswrapper[4795]: I0220 00:11:29.360500 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be" exitCode=0
Feb 20 00:11:29 crc kubenswrapper[4795]: I0220 00:11:29.360545 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"}
Feb 20 00:11:29 crc kubenswrapper[4795]: I0220 00:11:29.360583 4795 scope.go:117] "RemoveContainer" containerID="bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d"
Feb 20 00:11:29 crc kubenswrapper[4795]: E0220 00:11:29.544550 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:11:30 crc kubenswrapper[4795]: I0220 00:11:30.372433 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:11:30 crc kubenswrapper[4795]: E0220 00:11:30.373077 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:11:45 crc kubenswrapper[4795]: I0220 00:11:45.512027 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:11:45 crc kubenswrapper[4795]: E0220 00:11:45.512838 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:11:58 crc kubenswrapper[4795]: I0220 00:11:58.512017 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:11:58 crc kubenswrapper[4795]: E0220 00:11:58.512873 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:12:09 crc kubenswrapper[4795]: I0220 00:12:09.522958 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:12:09 crc kubenswrapper[4795]: E0220 00:12:09.523739 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:12:24 crc kubenswrapper[4795]: I0220 00:12:24.511732 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:12:24 crc kubenswrapper[4795]: E0220 00:12:24.512685 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:12:29 crc kubenswrapper[4795]: I0220 00:12:29.850319 4795 scope.go:117] "RemoveContainer" containerID="8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13"
Feb 20 00:12:29 crc kubenswrapper[4795]: I0220 00:12:29.871339 4795 scope.go:117] "RemoveContainer" containerID="29b21899c393a3503ba5ae6dcf3912d8e17c4a592516cd085863f44ad997cd91"
Feb 20 00:12:29 crc kubenswrapper[4795]: I0220 00:12:29.891403 4795 scope.go:117] "RemoveContainer" containerID="1bcbbf1fb3a51b7809b2c9d3857d7d30bd666a5c7ea1fd4ea2c2765511ea4288"
Feb 20 00:12:39 crc kubenswrapper[4795]: I0220 00:12:39.518625 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:12:39 crc kubenswrapper[4795]: E0220 00:12:39.519635 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:12:53 crc kubenswrapper[4795]: I0220 00:12:53.514879 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:12:53 crc kubenswrapper[4795]: E0220 00:12:53.516065 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:13:06 crc kubenswrapper[4795]: I0220 00:13:06.511775 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:13:06 crc kubenswrapper[4795]: E0220 00:13:06.512771 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:13:21 crc kubenswrapper[4795]: I0220 00:13:21.512321 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:13:21 crc kubenswrapper[4795]: E0220 00:13:21.513000 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:13:33 crc kubenswrapper[4795]: I0220 00:13:33.512746 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:13:33 crc kubenswrapper[4795]: E0220 00:13:33.513854 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:13:48 crc kubenswrapper[4795]: I0220 00:13:48.512022 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:13:48 crc kubenswrapper[4795]: E0220 00:13:48.512968 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:14:01 crc kubenswrapper[4795]: I0220 00:14:01.511800 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:14:01 crc kubenswrapper[4795]: E0220 00:14:01.512834 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:14:15 crc kubenswrapper[4795]: I0220 00:14:15.511553 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:14:15 crc kubenswrapper[4795]: E0220 00:14:15.512516 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:14:26 crc kubenswrapper[4795]: I0220 00:14:26.512528 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:14:26 crc kubenswrapper[4795]: E0220 00:14:26.513821 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"